Arugela

Members
  • Content Count: 790
  • Joined
Community Reputation

167 Excellent

About Arugela

  • Rank
    Sr. Spacecraft Engineer

Recent Profile Visitors

2,399 profile views
  1. The fun part is that, if the one description is accurate, nano computers are only millions of times faster. That puts some estimates for potential light-based calculation at around e15 in speed at 1920x1080 resolution, which is right between gigahertz chips and the supposed speeds of nano computers. And it's scalable, so I'll assume it's a matter of cost. I've always liked the idea of light-based computing; it's potentially endlessly scalable, which could be useful, even though I'm sure not in my lifetime. And I think scaling it to double the dimensions (4x the pixels) at 3840x2160 puts it in the same minimal range as the e18 calculation speeds quoted for quantum computers. So that is doable with modern video card display sizes. That is a pretty nice potential setup; we are not far off. Some articles were saying light-based CPUs in a decade. Not that that usually works out so conveniently, but maybe in a decade, if I'm lucky, I will have a light-based system.
Not to mention the combination work involving light on a current CPU. I think they meant in the die with the transistors, not just as the bus. That would be interesting. So yes, it can be supplemental, or potentially even a replacement. It could possibly even outdo quantum computers; they don't seem to be far off, especially as light-based is potentially infinitely scalable. Although I'm assuming quantum is even more power efficient. No idea how scalable that is; if it was in the stuff I read, I missed it. I'll assume light-based systems and quantum will go together at some point, as the lack of lag will probably be helpful. Unless there is something better out there to speed up copper.
Also, I wasn't sure where in the nanosecond range microLED response times are supposed to sit. That could throw the numbers off a bit. And I think one article said someone got a picosecond signal to the gate. I don't remember exactly what it measured; I think it was the time to get the signal in (or before the signal could be redone? Is that the same with the fluorescent gate? I'm not sure if it means an electrical signal, or the time before the fluid goes back to a state where it can be read again, or the sensor.)
https://etd.ohiolink.edu/!etd.send_file?accession=miami1375882251&disposition=inline
https://www.chromedia.org/chromedia?waxtrapp=mkqjtbEsHiemBpdmBlIEcCArB&subNav=cczbdbEsHiemBpdmBlIEcCArBP
https://www.ncbi.nlm.nih.gov/pubmed/18699476
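For what it's worth, an "e15" figure like that is easy to reproduce as back-of-envelope arithmetic if you treat every pixel as one switch event per clock. A quick sketch; the 1 GHz per-pixel switching rate is my assumption for illustration, not a published spec:

```python
# Back-of-envelope: count every pixel as one "operation" per refresh.
# The one-switch-per-nanosecond (1 GHz) rate is an assumed figure.

def pixel_events_per_second(width: int, height: int, switch_rate_hz: float) -> float:
    return width * height * switch_rate_hz

print(f"{pixel_events_per_second(1920, 1080, 1e9):.3e}")  # ~2.074e15 at 1080p
print(f"{pixel_events_per_second(3840, 2160, 1e9):.3e}")  # ~8.294e15 at 4K
```

On those assumptions 4K only quadruples the 1080p figure, so getting anywhere near e18 would presumably have to come from per-pixel bit depth rather than pixel count, which is where the packing ideas later in these posts come in.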
  2. That means it's a partial theory. And what you are saying is illogical and unscientific. The theory of probability doesn't actually say what you claim it does, nor does any serious scientific theory, nor any theory used by the people making quantum computers or any other hardware. The only way not to figure out its data is if you can't under the circumstances. What you are saying has nothing to do with real theories. All you are doing is mystifying theories and turning them into your personal fantasy, like a lot of very ignorant people today. And everything I said is true. And you are one of those ignorant people. Trust me, I've seen this develop many times in the past. We used to deal with this in forums like swatting flies, over and over again. Your generation just takes longer, like everything else with your generation. Slow as pudding, in every way. 8)
Your ability to figure it out has nothing to do with whether it can be figured out. You don't have the slightest grasp on anything you are talking about. There is no such thing as "probabilistic" except having to repeat calculations for accuracy, which still has to be done in a way that ensures an outcome, and only those parts will be used for calculations. That is how actual computing works, and everything else too. You, and a lot of people, need to get a serious grasp on reality. Computing only happens when you can predict something. That is how quantum computers work too.
Again, the ability to personally figure something out from the senses has nothing to do with whether it's actually possible to figure it out. We don't use our senses to pick up things outside our senses; we build equipment to do so. You are confused about basically everything on a fundamental level. There is no "probabilistic" outside of the determinism you are talking about. Even if our biological senses can't pick something up, that has nothing at all to do with whether it's deterministic. That is literally a matter of cause and effect, which is outside of that... That is merely how our senses work, because that is how everything in the universe works, by definition. You are literally saying to stop using reason and thought to figure something out. You do not understand what you are saying. This is why "deterministic" has the root word "determine"...
  3. There are no real random numbers. Real life is not random either; it's completely predictable. It's just a matter of how much you understand personally. It would just produce another form of "random", assuming it's even different. To use anything for computers you have to be able to predict it; that is literally how you implement it, by definition. Therefore it's no different from a normal computer, just a different form of circuitry. You, and a lot of people here, seem to not fully understand the concepts you are using, and are using fantasy as science. It doesn't work out. Everything comes down to inputs and outputs. Anything equalling something equals it; the means used to produce it don't matter. But modern computers and software are extremely pathetic and not well hashed out, or at least not to a very advanced state, even for basic things.
Quantum computers are not probabilistic, or your understanding of that is wrong. They are predictable just like everything else; that is how they can be used as a computer component. They might span a higher range of values, but they are always predictable, even as part of a more complex, varying set of circumstances. It's not different. At that point you are merely talking about complexity and range. But that doesn't necessarily work as well as you think when it comes to things like encryption and whatnot. It will still have the same issues that have to be dealt with, just the same if not more so. (If you don't understand something and implement it anyway, someone else will figure it out and use the weakness.) It will eventually be understood completely (and already is by those making it), and it will be the same as now. Just different. I will go gag and die for repeating that sentence now. Thank you for making me spew the most disgusting thing I have ever heard.
BTW, larger and more complex does not equate directly to better. In fact it can be worse (and likely will be in practice), especially against the human brain. And it can lead the user to use it without fully knowing it, to a worse level, leading to more openings. Everything works out the same in the end, for the most part. Or if you don't understand it completely, it doesn't matter much, potentially. As a test: take the supposed range of data or functional uses for a quantum computer, set that to a bit rate or computation requirement, and look at how much it would take to overcome it with normal computing.
BTW, that level of inflated funding and propagandising is not a good sign. "Probabilistic" is just a catch phrase to overinflate its actual use and sell the idea. It's extremely pathetic. It also shouldn't be needed, which means one of two things. Either there are endless idiots involved in its development, financially or otherwise, which will hold back its implementation; or it's not as good as they are making out, potentially meaning it will be phased out by other hardware and they know it! Or at least in its currently expected implementations. Either way, money and morons are involved. The M&M's of development! 8) Melts in your mouth, not in your hands! AKA after you eat it, not before! (Or the further you get into its usage.) Hopefully not, though; new computing devices would be nice. (FYI, psychologically, whenever you see advertising it's directly proportional to the perceived or actual gap in the item's practical value. We always respond the same way. So the more something is advertised, the more you need to examine it. Unless it's purposely advertised less by someone who knows this.
Someone feels it's lacking, to push it so much. And because our brains work on the existing data in our heads, and we don't have the full answer, the severity is always beyond what you are imagining. And more than you can imagine imagining.)
Edit: Other fun thing with light-based computation: as long as you don't have something out of sync time-wise, you can add as many devices and manipulations along the path of the bus as you like and still get the same time to compute. Although that would depend on implementation. It would mean a slight guaranteed delay overall, but if it increases computational power beyond the norm it could be worth it. It means you can use a string of light-manipulating devices along a path, and mirrors and whatnot, to do the same with high-bandwidth data.
To put it simply, the point is to treat an x-bit value as standing for a full string of 0's and 1's covering the entire potential range of that bit combo. Each color variant or combo represents an entire data string, in each potential combination. Then you can represent data in a very compact way compared to now. Or whatever else that comes to, which should be quite a bit. Pun intended. Example: 32-bit data x4. Each potential bit value is a color; each color is a data string. Maybe it's correct to look at it as a bit-by-bit expansion or something like that. It's doable regardless of implementation. It's turning small bits into large ones. Ciphers can be used if needed, etc., including actively changeable ciphers. Which means it's good that one device had much faster gate control; then you just need a bunch of separate read sensors synced up.
0 = 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000
1 = 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000000 00000001
So that means 1920x1080 at 32 bits gives: 1920*1080*32 = 6.63552e7 bits per frame, which at microLED speeds (one frame per nanosecond) is something like 6.63552e16 per second, or 58.935255667 pebibits per second... Am I doing that right? Not too bad compared to current CPUs. And 24-bit is still 1920*1080*24 = 4.97664e7 per nanosecond, i.e. 4.97664e16 per second, or 44.2 pebibits per second. And binary light is 1920*1080 = 2.0736e6, for 2.0736e15, or 1.8417 pebibits per second max.
OLED says 0.0001 ms? I'm not sure what is meant by this?! Taken literally that is 0.1 microseconds, i.e. 100 nanoseconds. That would make OLED 2.0736e13, around 19 terabits per second, or as low as 2.0736e10, around 19 gigabits per second, if they actually meant 0.1 milliseconds.
Yes, I was wrong about the 32-bit value being read as half a gig without additional work. But you could do that if you wanted. You could also turn it into other amounts; it depends on how you can manipulate it, with multiple nodes being read in one way. And increasing the bit count doesn't change the overall throughput. I was hoping I wasn't missing something on that one. Not sure who would win the console vs desktop vs handheld wars then... It might get weird.
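As a quick check, that arithmetic works out if you read the units as binary (pebibits, 2^50 bits); a sketch, still assuming the one-frame-per-nanosecond microLED figure:

```python
# Panel-as-parallel-bus throughput, assuming every pixel is rewritten each
# frame and the 1 GHz (one frame per nanosecond) refresh is achievable.

PEBIBIT = 2 ** 50  # the "58.935..." decimals only land in binary units

def panel_bits_per_second(width, height, bits_per_pixel, frames_per_second):
    return width * height * bits_per_pixel * frames_per_second

for bpp in (32, 24, 1):
    bps = panel_bits_per_second(1920, 1080, bpp, 1e9)
    print(f"{bpp:>2}-bit: {bps:.4e} bit/s = {bps / PEBIBIT:.4f} Pibit/s")
# 32-bit: 58.9353 Pibit/s, 24-bit: 44.2014 Pibit/s, 1-bit: 1.8417 Pibit/s
```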
https://en.wikipedia.org/wiki/Deterministic_system
BTW, if this is the definition of deterministic you are using: everything is deterministic. The very basis of reason, and of all man-made theories, is determinism. Without it we have no standing logic or ideas. Anything beyond it is impossible, as it is the definition of something existing and actual. All the things going into something are what make it; the common term at the moment is causation. That is the only condition in which reason or deduction is possible. Outside it, our brains, which are effectively deduction machines (via sensory data), cannot figure anything out. Everything people take as beyond it just means they haven't defined it yet. Also, any device requiring deduction (i.e. anything mechanical, logical, or physical) can't figure out anything past that, nor could they, or anything they are made from, provide it. We do not know, nor do we have a way of figuring, beyond it. It's literally the definition of the non-existent. If you see something beyond it, there is a logical fallacy in your base assumptions or definitions.
https://quantumcomputingreport.com/our-take/how-to-explain-quantum-to-your-classical-friends/quantum-computers-may-not-be-fully-deterministic-but-that-is-ok/
In this case they are simply idiots and should be removed from all research and anything touching it (I hate these sorts of websites). If they cannot predict it, it means they have not yet measured its cause. The points where we use it are the points where we can measure it (which is just an inefficiency, using raw speed and what I hope/assume is a currently limited set of completely predictable outcomes, even if it's across multiple instances). That is what defines a computer and how you implement one. It's the only definition. Unpredictability is only caused by a lack of knowledge, not a lack of cause. There is nothing, and will never be anything, that acts otherwise. It literally cannot happen. It's literal fantasy! These people are completely uneducated. They are just making money off gullible people, who have been growing dangerously ignorant over the last several decades. This is just junk science. Maybe tabloid science is a better term. That should sum it up for you.
Edit: And millions of times faster isn't that much more in terms of binary values. Deterministic and probabilistic are fantasy terms for the gullible (or probabilistic is). They are just implementing half-assed designs they don't fully understand, or are shortcutting, or are lying about outright. The entire thing could be fraud at that point, no other information withstanding, at least from the standpoint of that article's writer. This is a load of crap and the person writing it is full of excrement. All you can do is take a really fast device and use predictive logic to get an answer (which has to be completely deterministic), or else it's unusable as a computational device. Or it will be done on purpose, for massive fraud and theft. That may be something to worry about when and if this is applied for research purposes. You may see a new age of theft and conmen. And if the device has the potential for it, and they can hide it in any way, it will be the case without question. It could be the reason for its development on some level. These things will always be used to their fullest, and always as soon as possible.
I wonder if weird bit translation could be used to produce more correct geometric calculations, or even ones with unreal numbers, faster. Geometric processors are also needed; those would basically be unreal processors. Then they could make a true Unreal engine. The differences between 32-bit and 2^32 are odd values. I wonder if any produce something close to, or usable in, ultra-realistic versions of stuff. And you can still use the color outcome to represent a complex set of outcomes, spitting a single result at the program to translate as fast as possible and get your total outcomes. This allows a screen to display many outcomes.
Or use multiple pixels to effectively deal with a higher bit rate regardless. Then it's like a D&D chart with millions of options, and you can use tricks to get extended tables and other things. Then you can get the results for each screen section if wanted. The trick would be combined outcomes. I guess the only way to get that would be really fast writes... although I think the one device had picosecond response times. If you could sync up the reads, staggering them to match the write speed, you could just do it quickly.
https://www.extremetech.com/extreme/223671-heres-why-we-dont-have-light-based-computing-just-yet
Well, the bus idea was right; it could make a good bus system. So why not move to light-based buses on mobos in general? Then you can intermix light and electrical processing units. Plus, with light-based, if you can get a non-soldered connection for the type of stuff I'm describing, it's removable and upgradable. So a GPU with a light-based bus translator, or a more advanced unpacking device, could do advanced work towards the bus or even out to the display. Optics would be convenient as a display connector, I would imagine. One of the tricks I'm referring to, he calls a photonic logic array. I wonder if you can't do the same with an abundance of different read heads in different locations, or with different read abilities. 32^6 x4 is the same as the 2^32 I was using (32^6 is also 1 gigabit). So if you could use multiple nodes to work together, in effect it would be like the static RAM of light-based computing, hopefully made even more powerful with better devices. If anything, supplementary: even if quantum computing overtook it, it could be used for things quantum computing struggles with because of speed factors, basically slower things. Or as an early version of quantum computer output for similar computational tasks before quantum is common, assuming it's useful before then.
If you used it to store massive game worlds, you could use it to speed up loading, aside from issues like hacking or things requiring a server to keep track of them. It could still be used, with fast enough writes or equivalent size, to help deal with that ahead of time for load times, I guess. Assuming you can't just remove sensitive data like enemy locations. Terrain wouldn't matter unless there is fog of war to overcome. Someone also mentioned the potential advantage of making very large arrays instead of multiple servers. This could save space and overall heat in server rooms and whatnot.
And the 3d version of this idea is holographic 3d computing arrays. The visuals can be the computations or the results. You could have swirling cool holograms doing your computations for your system. Or does that just produce the same computation, since you have to read the output from the other sides... I guess you could use 5 sides for compute and a 6th for read. Or all sides for compute, in massive sets of 6 arrays, with a bunch of external reads, via either another light with read or something else. I guess that is not limited. You could also use natural light as a source... It would kill gaming and anything done late at night, but... BTW, how many detectable light colors or variances are there in natural light, maximum? Including UV and other ranges. How much of that could be used as unique values for a computer, hypothetically? Also including all detectable qualities of light.
http://sciencenordic.com/optical-computers-light-horizon
This is interesting: I would have thought that was too hard for modern CPUs. I'm assuming it's a larger transistor size.
https://news.berkeley.edu/2015/12/23/electronic-photonic-microprocessor-chip/
  4. It doesn't matter what the actual means of producing the light is. The point is the ability to read, and effectively produce, a range of colors wide enough that one "bit" carries an actual 32-bit value. (That's just an example; 24-bit seems too small for a minimum, but any and all bit lengths can be used, and even switched between, for any useful purpose.) As long as it can produce the light in any way, or multiple ways, it's fine. In fact, the more the merrier. I didn't think this was complicated enough to need much explaining. If you can display or calculate a "bit" with a range of colors, you get that amount of data representation. What is there to explain?! Once you get correct colors, or represented data, you can use it to collapse and translate data into actually smaller bits: packing and unpacking data from much smaller data. Then you can expand current data, or optimize current tech in new ways if necessary. Either way, you need ways to get the data out efficiently. So then you have fun with logic. You need to figure out all the ways to do this anyway, so it's kind of par for the course. Not to mention when and whether to translate between optical and other means. But it could open things up: instant or near-instant ways to do current tasks that take a long time. If the logic is faster, or can be made faster, you can use it with current tech to handle data faster, shrink download sizes, encrypt, or whatever. The last idea was to use sound: if it's already using actual 24-bit, it's an easy translation, and you can use more of your hardware to complete current tasks. The idea of much larger functional amounts of RAM opens up a lot of things, including more complicated physics calculations, and things effectively similar to quantum computer output, if desired.
If you display a light that is in the range of an effective 32-bit color range, that color then represents a large string of binary. That is data. You can actively keep up with sufficiently fast hardware with a light display, or effectively fast means. It depends on the use case and what can or can't be done. Even if you can't change the data, it allows large amounts of data to be displayed without a refresh, effectively. There is a lot you can do with that, given that it allows live expansion of data into 32-bit representation. You can also try the same thing with cache RAM in a CPU, and two 64 KB chunks of cache or similar. It's not about selling something; it's a matter-of-fact issue. I'm just not sure where the downsides are. I'm sure there are some, or many, but I'm not sure how many ways there are around them either, at the moment. It might be surprising; it usually is. The oddity is that the things I've seen on light-based computing are all low-value binary-based. I don't see why it's not used at a higher bit rate; that is one of its potential strengths. This concept allows any system to be maximized: no more partially used buses, etc., and much more profound software, even with current hardware.
Edit: Maybe this is where the confusion is coming from. I'm saying represent the 32-bit color range with colors, each representing a string that is half a gigabyte, or 4 gigabits, in size. So, unpacking 32-bit into a larger value. Each color would stand for a unique half-gigabyte string of 0's and 1's, i.e. the full possible range of 32-bit as a fully fleshed-out potential range of data combinations, translating back and forth as needed, or pulling data out. Each color in the spectrum represents a large string of data. That allows much larger data to be used. And if it can be packed into 32-bit or whatever sizes, it can potentially extend existing systems past their bit limits and give them more RAM, in essence enhancing existing computers.
One fun thing you could do is not calculate advanced physics at all, but display the colors and shapes of the results straight to the screen. Then, with enough knowledge of the results, the screen's result data could let the program respond in a much simpler way (this would have to be done to the display when sending the image too), bypassing complex physics results altogether. Not sure how to accomplish it, though; I'm sure it's been gone over in the past. If anything, it could be combined with normal physics to supplement it, if not entirely replace it. It would probably be circumstantial. And if the last 8 bits of 32-bit color are for transparency, it doesn't matter, as long as it's a unique readable color, or effectively is one.
Another trick is to have a way to read the lights individually, over multiple nodes, or a single node if that's enough, then use that as an effective computation device to get the equivalent of a larger data string. This potentially allows instant computation, depending on circumstances and the size of data, which is one of the main points of such a device: the potential to do all calculations within a single instance, or as close as possible. Having multiple read sensors could help accomplish this, meaning writes are minimally needed but still ultimately useful. Anything predictable can be used to calculate where needed, so you can customize software to get the needed results. So, the more read points with unique outcomes, the better. Then you can do the same with things besides the light itself. Reading heat, or other qualities of the light or device, which can also be used to check the accuracy of data, could serve as a calculation method and be used deliberately depending on circumstances. The more ways to read, the better. For instance: say you have a node made of 4 LEDs. If you can read each individual LED's light, and the end result, and across multiple nodes, you can use prediction to calculate something, both from the combination of lights being used and from the outcome, and from multiple outcomes depending on how you display it. This can be done by writing/displaying fast and reading, or by having large amounts of written/displayed data sitting there to be read from the correct place in the device to find the results. Either way it works. The potential problem with slow writes is that you are more likely to run out of effective space, as it's limited by the device's maximum size and usable space, depending on the application.
You can also use this to maximize multi-core processors, by flooding them at 100% at all times and using a more predictable flow of data, stringing the large chunks of data together into an outcome. If all data is one big chunk and it's tagged internally, you should be able to more easily string the outcomes together and keep multiple cores in use, even if (large) parts of the data are nothing but 0's and empty fill. So you will probably want more power-efficient CPUs, or some way to reduce the power consumption. Maybe the predictability will lower CPU usage; I guess it depends on the software. There could be tags to skip calculating certain pools of 0's, so no power is actually spent on empty data. Then other 0's would be part of the calculated data. This could massively reduce power use. The data is also never used differently, making it predictable, but not outputting nothing, so it's still easy to find in the same manner with less power usage. Or specialized CPUs could treat the 0's in a way that doesn't use power. This could lead to simpler CPU designs, or more specialized ones meant to reduce power usage. That could help to literally or effectively do multiple calculations at once to get complex results. It's all a matter of software design, or, to whatever extent, hardware abilities.
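Here's the pack/unpack idea from this post in miniature. A toy sketch: an 8-bit "color" and 16-byte strings stand in for the 32-bit colors and half-gigabyte strings (which no real table could hold), and the codebook itself is made-up random data:

```python
import secrets

# Toy codebook: each displayable "color" indexes one long, pre-agreed bit string.
CODEBOOK = {color: secrets.token_bytes(16) for color in range(256)}
REVERSE = {v: k for k, v in CODEBOOK.items()}

def unpack(color: int) -> bytes:
    """Expand one displayed color into the long string it stands for."""
    return CODEBOOK[color]

def pack(data: bytes) -> int:
    """Collapse a known string back into the color that represents it."""
    return REVERSE[data]

blob = unpack(42)
assert pack(blob) == 42
print(f"8-bit symbol -> {len(blob) * 8} bits represented")
```

The catch, as far as I can see, is that the expansion only covers strings already in the codebook: a full version mapping every 32-bit color to a distinct half-gigabyte string would itself be a 2^32-entry table, so the scheme trades pre-agreed storage for bus width rather than creating data from nothing.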
  5. Is this what is used in sound files now? If so, can't we just do this in software to shrink file sizes for download and upload? https://www.bbc.com/bitesize/guides/z7vc7ty/revision/4 Couldn't we just use our audio equipment to accomplish this to some extent? If not, have the CPU use cache RAM to help decode things. I think 32-bit only needs, at worst, two chunks of less than 64 KB each to decompress a file. I wonder if we could use dial-up again if the info is small enough. 8) Back to slowly downloading massive files and waiting eagerly to see the results!! ><
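For reference, the arithmetic behind that BBC page is just sample rate x bit depth x channels x duration. A quick sketch of the formula:

```python
def pcm_size_bytes(sample_rate_hz: int, bit_depth: int, channels: int, seconds: int) -> int:
    """Uncompressed PCM audio size: rate x depth x channels x time, in bytes."""
    return sample_rate_hz * bit_depth * channels * seconds // 8

# CD quality (44.1 kHz, 16-bit, stereo), one minute:
print(pcm_size_bytes(44_100, 16, 2, 60))  # 10,584,000 bytes, ~10 MB
```

Which is exactly why lossy and lossless codecs exist: the raw numbers get big fast.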
  6. Now, when is 128 bit support coming? Surely supercomputers and servers might want to run this game. If you aimed at making it run well on them you might deal with issues on other computer types! ;d
  7. So, if you just went with the representative data instead of the hardware, what type of things could you process? Where are the limits?
1. Pretend you have an image file of a 1920x1080 monitor at 32-bit color depth. This is logical data representing the screen display. You could then apply logic to analyse it and pull out data with modern hardware. This is an immediate increase in a computer's size and power based on software logic alone.
2. Can you put an image inside an image? This just changes the data. That makes the data depth a matter of software translations. It adds both depth and breadth to data, and natural encryption with each layering: the deeper, the more complex the encryption. It also means all modern networks can send far more data than previously. To get the desired data you would simply translate it twice, or treat it as a deeper layer of logic inside the existing image data. And each layer could change the bit depth to disguise the data. That could be an enormous number of data patterns to search through to find the correct data: natural, near-infinite encryption bounded by computational ability, especially if time is no object, on top of modern encryption methods. What are the limits and problems with such tasks? Would quantum computers help find the data, or would they not be needed? I would think there would be ways, if you know the structure of the data ahead of time, to find the exact data quickly based on pre-known circumstances. How much could this be applied right now, to all software? (See the sketch after this post.)
https://wccftech.com/microled-vs-oled-everything-explained/
https://www.displaydaily.com/article/display-daily/microled-emerging-as-next-generation-display-technology
If not, this is supposed to get actual ns response times with no burn-in and a long lifespan. It just might be pricey. But if it gives you a petabyte of VRAM, then why not? If it can give you near-infinite encryption and RAM-like speeds, then who cares? If you get enough response you can simulate even deeper bit depth and get much more interesting data retrieval and other functions, maximizing use. Even using them as a processor, depending on light-manipulation methods and whether you can make up for the relative difference to CPU speeds. And if power savings is a thing, the price could be worth it. Maybe. Imagine a hybrid microLED/HDD or microLED/M.2 drive. With the data manipulations, I think at one layer, at 1920x1080 at 32-bit, you can get 66 thousand times the data storage of a 4 TB drive with compression. And more complex things could probably be applied. You are talking near-infinite space and potential computational abilities. Imagine 128-bit layered image data that is translated, where each layer has a random bit depth to translate through, and each layer is also encrypted individually!!! You need to apply each layer correctly to get the data, or nothing!! And any other method could be applied to make it more complex. I'm sure ways around it would be found in abundance, but it could be hard where and when it needs to be hard. All this from one basic image. I wonder if the weakness is in the end image and its predictability. I imagine, though, with ns response times decrypting the data, you could add thousands upon thousands of layers of encryption, let alone applying increasing forms of encryption on each layer, and it keeps getting more complicated, fast. Any native machine logic that goes over the image layer could also be applied simultaneously with the computation of the LED device, for multiple purposes. You're taking binary into a 3d realm of processing functions.
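The "image inside an image" idea in point 2 is essentially steganography. A minimal sketch of the classic least-significant-bit version, with plain lists of ints standing in for pixel data (a real implementation would use an image library such as Pillow):

```python
def embed(pixels: list, payload: bytes) -> list:
    """Hide payload bits in the least significant bit of each pixel value."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for payload")
    return [(p & ~1) | b for p, b in zip(pixels, bits)] + pixels[len(bits):]

def extract(pixels: list, nbytes: int) -> bytes:
    """Read the hidden bytes back out of the low bits."""
    bits = [p & 1 for p in pixels[:nbytes * 8]]
    return bytes(sum(b << i for i, b in enumerate(bits[k:k + 8]))
                 for k in range(0, len(bits), 8))

cover = list(range(256)) * 32             # stand-in for real pixel values
stego = embed(cover, b"hidden layer")
assert extract(stego, 12) == b"hidden layer"
```

Layering it (an image hidden in an image hidden in an image) works the way the post describes, though each layer costs capacity rather than creating it, and hiding is not the same as encrypting, so you'd still want real encryption per layer.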
I wonder if you could use compression tricks and these devices to speed up, or lower, the overall processing difficulty for monitors and VRAM, letting displays render without bogging down the GPU when driving multiple displays. You'd probably have to get past the response times. That could be a use for fast reads and large displays. You could use a fixed, overly strong image to translate data, like a crypt, just for enlargement or something. Maybe security is the issue. Der... OK, I was thinking you could represent a screen color combo in less than the full 32-bit data size, as a smaller bit length, and reduce size without doing anything special... Not sure why I was thinking this. But there should be tons of ways to reduce the data count, especially with combined color representations or deductive notation.
OK, it can be reduced, since it can be represented by 32-bit:
1. Consider the color result of multiple colors. This can be stored as a single color to reduce overall size, if you know what to translate it back into. Then you just need a cipher to translate everything back. Reduction in size. (Possibly any size combos.)
2. A notation system. Break the colors down into grids. If you break them in half and note which half, it can display the grid more quickly, as each step halves the colors needed to find the grid. 32-bit could, at minimum, be half the data size with a single bit. Each reduction of the color spectrum then produces a smaller bit to go with it. This reduces the size of the data. (An exponential cipher.) Example: 32-bit represented by two 64k-sized numbers (65,536 x 65,536 = 2^32), one for the notation and one for the color grid. 32-bit colors represented in two 64 KB chunks; you could even store that easily in a CPU's cache. A gain of 32,768 times the storage; 32^3 times, actually. (Maybe a fast emergency CPU-onboard display logic for startup if you don't have a GPU present?) You could do the same with two 64-bit values: a gain of 2,147,483,648 times the storage. And 64^4 is 24-bit, so it could have easy conversions.
3. Could you break the exponential ciphers into smaller exponential or other ciphers?! How about 2x2 8-bit values?! Which can be reduced down to 4x4 4-bit values?! Which could be broken down into 8x8 2-bit values? And finally 16x16 1-bit?! Can you do less than one bit? 8x8-bit is 24-bit again, making conversions potentially fast, depending on the color breakdown. Or do it logically instead of by color changing, or lay it out differently on the screen for conversion. They could be displayed in different spots, in order to collect or change the color value for reading, in many different ways. (A sketch of this decomposition follows below.)
Would the physical devices help? At minimum you could store more data and hold it in smaller chunks on an HDD. Or could you get the equivalent from a CPU/GPU now? I would think the advantage is easy display, then reading/manipulation, of large color spectrums for fast processing. Or the advantage kicks in whenever you are past certain aspects of a current system, like maybe its bit width, or a convenient bit width to process via current system resources, assuming it can't be done with smaller notation. I'm forgetting: the advantage is copper translated into light for quick transmission. Plus the connector could be non-soldered, connected through something as simple as a video connector from your GPU; then transmit or receive with optics, so no soldering is needed, as long as you can affix it properly. Am I missing something else? I might be messing that up again.
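Stripped of the color framing, the "exponential cipher" in point 2 looks to me like plain radix decomposition: one 32-bit value split into progressively smaller digits and recombined losslessly. A sketch of that reading:

```python
def split(value: int, chunk_bits: int, count: int) -> list:
    """Break a value into `count` chunks of `chunk_bits` bits, low chunk first."""
    mask = (1 << chunk_bits) - 1
    return [(value >> (i * chunk_bits)) & mask for i in range(count)]

def join(chunks: list, chunk_bits: int) -> int:
    return sum(c << (i * chunk_bits) for i, c in enumerate(chunks))

color = 0xDEADBEEF  # an arbitrary 32-bit "color"
for bits in (16, 8, 4, 2, 1):
    chunks = split(color, bits, 32 // bits)
    assert join(chunks, bits) == color   # lossless both ways
    print(f"{32 // bits:>2} chunks of {bits:>2} bits")
```

Note the total bit count never drops: two 16-bit numbers are exactly 32 bits. Any gain of the kind described here would have to come from the hardware reading many such chunks in parallel, not from the notation itself.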
Would this not change Moore's law from doubling to something even faster with each iteration?! It would potentially be an increasing rate, at a potentially increasing rate, over time. If so, screw Moore. He was an idiot. BTW, do we use this method anywhere now? I would think GPUs might.
  8. I am assuming there are ways to whittle this sort of thing down, though. If you have read nodes that are not isolated, but can read across the spectrum, and you can change the light, you can ultimately apply a very large number of methods to keep it up to par and get close. You could logically design the device to do something like that specifically. Although I would design for maximum usability and function, as its one clear benefit is massive data storage and representation. You could apply filter logic in a structured environment to get the things you need. The idea is you would design its use around knowing, from the start, the total of what you are doing with it. Plus 1:1, or even greater, read ability per node, depending on how you made it. I would think there are very interesting mechanical and other means to process light, EM, and other things.
One thing you could do is read multiple aspects of the light, or more than one thing at a time, as a way to store data and use it. It's not limited to just light. Reading multiple qualities (or deciphering them) of non-physical things as data could be combined and read at once. How many ways can you read light, or the other aspects needed to produce it? Every point could be read and used as analysis to speed seeks up. Not to mention manipulation as calculation, or anything else usable. There would be a hell of a lot of things you could do with it. Plus, this would have many other uses. If anything, if you can read the screen physically and store the data, you could still use it as a display; you just have to keep the read device up to date on screen changes so it can still read the data in it. It's basically combining all the things in a current computer into one device. So it should be usable by all devices in a current computer, and a good companion to them, on top of anything else added. If done well it could be a good companion to quantum computers too, as it could be made to do heavier computations than a binary device in a smaller space, since the bit-depth usage should add more functionality and hence power. This kind of tech should scale with computers as well; it should be able to use most advances that other devices benefit from, in some way or another. If it's equal as RAM to a modern HDD in space, it might stay that way over time, at least as far as the end results. There would be more dimensions to play with to keep it up to par. And the way this works obviously already exists all over the place; it's just a matter of applying it.
https://en.wikipedia.org/wiki/Nanosensor
https://en.wikipedia.org/wiki/Nanophotonics
Combine all applicable levels and you can increase processing or other abilities. There is no end of ways to deal with the data and get what you want out of it. It would be even cooler if it could melt parts of itself down and remake them with lithography internally. Some parts could be regenerated and corrected for mistakes. This stuff doesn't even have to be accurate; it just needs to be predictable.
https://ieeexplore.ieee.org/document/8409924
Is one advantage of fiber-optic, or other light-transmitting data methods, the lack of need for a hard joint? I.e., if it breaks you can replace it, because it's sending light and doesn't need a soldered connection? That would make the tech easier, as it's replaceable by definition, as long as the light transmitter can be aligned or whatnot. And you could send light along the whole thing and manipulate it to remove certain issues, or add to it, potentially. I guess it depends on when you translate the info. You could combine it with old-fashioned soldered chip joints if you needed to send some by cable, and have them hold the fiber-optic lenses in place to send the data. Or something odd. Actually, the downside might be security: https://www.google.com/search?client=firefox-b-1-d&q=Steganography Although if you can secure that, it could be a nice tool. You would have to anyway.
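On the "read multiple qualities at once" idea: information per read grows with the log of the number of distinguishable states, and independent properties multiply the state count. A sketch with made-up state counts (placeholders, not measured figures):

```python
import math

# Hypothetical distinguishable states per independently readable property.
properties = {
    "color levels": 256,
    "intensity levels": 16,
    "polarization states": 2,
}

states = math.prod(properties.values())
print(f"{states} combined states -> {math.log2(states):.0f} bits per read")
# 256 * 16 * 2 = 8192 states -> 13 bits per read, vs 8 from color alone
```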
  9. I said alternative because you can accomplish the same thing with massive RAM and charts of the total end results of all calculations, combined across all potentials. The limit is how much RAM, if you can't refresh fast enough, and that can also be worked around. I'm not derailing anything; this is how you get massive amounts of RAM now. A higher bit rate is the mechanical means of accomplishing an alternative. The point is that quantum computing is, logically (in the data's end result), the equivalent of a 3d chart of results representing complex data outputs (i.e. real-life data results). This is the 2d version. If you know the range of things involved and the potential end results, you can display all combinations combined and get the same results. You could even test quickly for things outside your parameters. Quantum computing allows massive simultaneous data to be worked on; so does this. It could even be RAM for quantum computer processors. It has potential for greater data flushes, etc., to go with more powerful future PCs using much greater data flows. And I'm sure there are more efficient ways to do it. Sorry, but this is how I started the thread. You guys really hate new concepts. You would think people would like the idea of massive VRAM and RAM on your computer.
And yes, this can get up to speeds for modern computers. The downside is probably power, or longevity, maybe heat, or price. But with how much we can do to compensate for the downsides, it can probably be made much faster than current technology. And you can apply anything from monitors, RAM, or HDDs as tricks to it, at minimum. You could probably even use it as a CPU if you wanted, maybe a 3d layered CPU. Size is less important when you achieve greater bit rate and throughput. If the downside is price, it would still be massively useful in areas where that is less of a concern. You could speed up very large workloads exponentially. It would allow bus systems on mobos to be maximized at all times, and systems to maintain maximum throughput. We did run computers like this once; this would just be going back to some old tricks. Not to mention the ease of compression and mass storage. It would make all server loads exponentially cheaper and lighter. You wouldn't need anywhere near the current permanent storage, or you could make the current storage exponentially greater.
https://portal.research.lu.se/ws/files/2711949/2370907.pdf
The other advantage is you can start applying techniques like this to something as simple as your phone. You could get expensive diagnostic results from your phone and send them to your doctor as a cheap or free initial scan (probably could now), with better equipment used later in a lab if needed. Anything using any aspect of this tech, or similar, could be made available at home. You could also store the state for reading with the power off, in a form of HDD/DVD/floppy drive. But it's probably easier to send the data as image data, since that's smaller and faster. It could still be used as dockable data, or as a way to deal with power-offs, low-power mode, or other things. Very useful if used as cache. To speed up processing you could use tricks to manipulate light directly at some point and get it into the data form that's needed, or any other aspect of the device. If you had fast writes, you could even change the light value of the light as a means of computation. Many things would be possible if it has multiple ways to keep track of read/write and so on. Bend/change the light and get to the part of the data you need.
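The "chart of all end results" idea is basically a precomputed lookup table taken to its limit. A sketch of the trade it makes, with a stand-in function over a deliberately small input domain:

```python
def expensive(x: int) -> int:
    """Stand-in for a slow calculation."""
    return pow(x, 65537, 2**31 - 1)

DOMAIN_BITS = 16                       # cover every possible 16-bit input
TABLE = [expensive(x) for x in range(1 << DOMAIN_BITS)]  # pay the cost once

assert TABLE[12345] == expensive(12345)  # afterwards every "computation" is a read
```

The catch is that the table doubles with every added input bit, which is exactly why the post leans on "massive ram": the memory is doing the computing.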
  10. But the problem of the world rotating around a different thing on each machine is not really an issue, potentially. That was one of the main drawbacks people couldn't figure out, from what I've read. The problem is moot: multiplayer can be done logically, like I said. So it's doable. It's more a matter of lag at that point, like all multiplayer. And the physics doesn't even need to be live; your computer could run their physics like any other object's. It should be more a matter of how much data is sent vs how much has to be processed live, which is doable no matter what. Only the N-body physics, plus the exponential physics cost from parts count, stands in the way of multiplayer then. Remove that, keep planets on rails without n-body physics, and you can do multiplayer even if you can't now. Which you could, if you don't care about lag when getting close to other players. Then it's probably a matter of game content for multiple people being added, which regardless shouldn't be that bad, as you could stay away from each other during the game. Or use smaller ships if needed. Or not, if you want to mess with somebody! ><
Those are your words, not mine. You edited my statement to "just" and insinuated. The base logic stands! Technically, lag is less important in this game, as you have fewer points where it matters. You don't have to spend time near other players; in fact it's much less likely unless you really want to. Which I'm sure would be common. But I would imagine it's not as huge an issue unless you have very large ships. And you can always have settings to stop the physics of other players on your machine, to avoid it in many ways. You could probably simplify their physics bursts as well. You don't need perfect timing in most cases, just close proximity. Although more accuracy would open more doors for things. But you can have settings that let people adjust for different circumstances. Most, if not all, of the stuff to do this is in the game already.
  11. Either way, the real point is a higher-bit conversion machine (and instant mass-read functions). In this case using something common and getting lots of RAM/storage out of it. It only needs enough space to display some information semi-permanently, and could naturally have large amounts of space from the high-bit representation. Like I said, one 1920x1080 display at 10.8m is a petabyte of information at 32-bit depth. It grows rapidly from there, so a few bits of depth difference increases it exponentially. If you simply load all info at the start of a program, this would give huge data for VRAM addition: basically extended VRAM. They already put those needless displays on their expensive mobos and cards; why not make them useful? They could add them to mobos too, for extended system RAM, or doubly extended system RAM. It could also display system info like a screen between uses, or temporarily if needed, if you can use it without making the pixels invisible. This would work in tandem with other dynamic system RAM by having a quicker pull, or greater space, than an HDD, depending on use. Programs could write into massive displays to store info for the CPU/GPU. Especially if it's faster than an HDD or M.2 or other permanent storage, it could be invaluable for system designs. And lots of stuff can be done with excess system RAM, especially if it can access and dump from all points of storage simultaneously. If each node can be read and transmitted at once, that is massive gathering time saved. You could maximize bus usage without a problem. You might even want to add fiber-optic or other faster buses alongside the regular ones.
OLED even has 0.1 ms refresh. It's not the same as other parts of the machine, but it's pretty good considering you don't need to change the display for basic functions. If you do, you can even use it as base system RAM, parts longevity aside. But that could probably be dealt with. Interestingly, 24-bit makes it equivalent in capacity to modern HDDs. It could be a dump for HDD access when the system is on, if it has faster access times. 32-bit could be the larger capacity for super use, especially if you can use it to compress at a higher bit rate. This means normal HDDs could store data for any higher bit rate as image data, and then have it used for things we can't do today, like displaying petabytes of instantly readable static RAM for software. Or do smaller backups, converting to 24-bit for HDD backup, or even 32-bit compression. I'm sure more could be done. Then run them through the system for live, instant display and reading of massive amounts of data, and even backup utilities. Backups would be very different if you didn't need 1:1 storage for them, unlike the limited modern compression via removing spaces, etc. Although those could also be used, and these tools could help speed that up with the right logic.
The write functions don't have to be fast, as you can compensate with mass data; the read functions, and getting the data out, do. And if you can read all nodes, that can do things RAM and HDDs can't as easily. Plus, even if HDDs like M.2s got close to as fast, you could use them as write targets so as not to wear out the other storage. I would assume monitor tech, even heavily modded, would probably have longer-lived write functions. They could even use it as cache for M.2 drives, placing them on the back as tiny displays with multiple purposes. Might need faster write speeds, though. Or use the excess size for tricks that simulate higher write speeds. Then you could write less to the final drive, potentially. Although having faster actual display speeds might be good in that case. It could be very good supplementary tech.
https://www.ncbi.nlm.nih.gov/pubmed/18699476
Not sure how relevant this is, but if it can't read light, there could be a second reader on the other side of the light node, used as a display-off read or similar, depending on the application. You could use stuff besides light; the point is to make a greater-than-binary bit conversion/function for data. You can get stuff out of it for modern computers. It could help extend modern tech beyond its current limits. If an OLED was used at 0.1 ms response, and it displayed half a gigabyte per refresh, isn't that logically capable of being faster than RAM, or of being made to work at the same speeds? You could use tricks with the size of the data, the response time, and the ability to flush mass data, to outdo RAM in calculations or sync up with it. It's potentially faster in certain situations if you consider it from the standpoint of sending things at a 1-bit rate; the bit rate is effectively multiplied by that factor. And mass data points can also make up for small data grabs, with accompanying logic. Even the slower 0.1 ms could be played with similarly, to create an equally fast or faster functioning form of RAM. Although that may cut into the effective capacity. It would still probably be more, and more versatile, than RAM. Although I would assume RAM might be more stable in the long run, or have other advantages, probably power and longevity. Half a gigabyte of data at 32-bit depth is 0.5e-8, and 0.01 response times for RAM (is that correct?) could be equivalent logically if the display is 0.1 ms, or 0.1e-6?! I don't think that is correct, but it is close. Even 24-bit would be close, node to node, if you think about the total data sent.
One trick for slower devices could be to layer the logic in color patterns like an HDD, so it's split between all nodes, and you can take parts of it with each read device quickly and put it together, to get much faster reads or seeks. It would have a lot in common with a spinning HDD, except that the layering could be logically written into the image, even from non-shifted data, fairly quickly, as you don't need to change one point at a time like an HDD; you can change all points at once, potentially, or relatively fast. This could speed something slower than RAM up into an equivalent if needed, or relatively speed it up regardless, for faster functioning in any situation.
https://etd.ohiolink.edu/!etd.send_file?accession=miami1375882251&disposition=inline
Would any of this help? https://www.chromedia.org/chromedia?waxtrapp=mkqjtbEsHiemBpdmBlIEcCArB&subNav=cczbdbEsHiemBpdmBlIEcCArBP
If you have static solutions, and you could excite them fast enough, maybe you could use a dead-read function to get the data out, and have lots of little LCD-like things with single colors. Or can you use this to change the light, to make varied colors, or change other aspects of the color to change the read data? And even if the change in fluorescents is in the milliseconds, can't the data depth be used to effectively get faster data? It could have its own cache, or be attached to system cache, to help deal with the data flushes and get the correct data out. And that would only be needed to overcome the inability to change data as fast as RAM, which is unnecessary if you rely not on write speed but on mass reads of data. But if both can be achieved, more could be done.
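On the "petabyte of information" claim: it checks out under this thread's core assumption that each pixel's 32-bit color value stands for a distinct 2^32-bit string. A quick sketch of that arithmetic (the assumption is the thread's, not an actual display spec):

```python
# Represented (not physical) capacity of a 1080p panel, assuming each pixel's
# 32-bit color indexes a distinct 2^32-bit codebook string.

pixels = 1920 * 1080
bits_per_string = 2 ** 32          # the "half a gigabyte" string per color
total_bytes = pixels * bits_per_string // 8

print(f"{total_bytes / 1e15:.2f} PB represented")  # ~1.11 PB
```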
The really fun part of this would be that those wild RGB computers with multicolors would literally demonstrate the power of the PC. The more rainbow capability, the more you can do!! >< RGB WILL RULE THE WORLD!! You could literally buy the equivalent of a crappy RGB strip, stick it in your computer tower, and upgrade your PC. You could wear crappy RGB wrist straps at a rave and dance around like an idiot while your PC does the calculations for your experiments. And the brighter you are, the more you can do. It would be a whole new world! Just cover yourself in strips, carry a small pocket PC, and go on your way. VR in your pocket. Just need full-on normal glass displays for real prescription glasses.
Can stuff involved in spectrometry, or similar, currently read light, or anything else, in the form of a high bit depth? I'll assume this much heat would not occur. Or I hope not. Monitors don't get that hot, but I guess some variations might. I wonder how hot the stuff in those spectrometry articles gets. Some were using temperature to adjust the lights or something.
  12. If you only display the other person from your machine, can't you just run them from your physics perspective? Why do we need to change the base game to get multiplayer? If each person runs the world so that it revolves around them, then just display the other person as an asset from your game's perspective, like every other object, and add physics when needed to simulate the same results. Why is multiplayer not achievable? Is that too hard to do? The game already does this with any craft you are not controlling. What is the difference? Live updating?!
  13. Can you not have each person render the game the way it is now, and then simply report their position to the other? If the visuals around you are processed on each machine anyway, you just need to keep the other player's position and needed data up to date. Why does it matter that the other person's game renders by moving the world around them? You can do that locally. Only neutral data needs to be exchanged to render the other player on your side. Their physics can be computed on their machine, and yours on yours... Why does the game even need to change how it does anything for multiplayer? All you need to do is represent the other player accurately at close range, at minimum. This does not require your game world to move around the other player at all!! 8\
Could you do a limited, or even non-physical, representation that accurately just displays the results of their actions? Then, to each their own as far as physics goes: you only need to see the results, not calculate them on your PC. And calculating the interactive physics should not require you to calculate the world-rotation issue either, just enough info for collisions, etc., from the base model of the vehicle. In fact, you could use that logic to get rid of parts-count problems for multiple vehicles, if this is not already implemented. The oddity then would be two people in multiplayer with different physics mods, one with stock physics and the other with more realistic aerodynamics. Although that could be cool: you could show the differences live, side by side. So you could leave that open on purpose. You could also have an option to live-compute the other player's physics to some greater extent if you wanted, or go without.
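As a sketch of what that scheme might look like on the wire: each client owns its own craft's physics and only ships small state snapshots, which the peer dead-reckons forward and renders as just another object. All field names here are illustrative, not from KSP or any actual multiplayer mod:

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class CraftSnapshot:
    player_id: str
    position: tuple      # world-frame position, metres (illustrative)
    velocity: tuple      # metres per second
    timestamp: float

def encode(snap: CraftSnapshot) -> bytes:
    """Serialize a snapshot for the network; tiny compared to running physics."""
    return json.dumps(asdict(snap)).encode()

def extrapolate(snap: CraftSnapshot, now: float) -> tuple:
    """Dead-reckon the remote craft forward so lag shows up only as drift."""
    dt = now - snap.timestamp
    return tuple(p + v * dt for p, v in zip(snap.position, snap.velocity))

snap = CraftSnapshot("player2", (7.0e6, 0.0, 0.0), (0.0, 7.5e3, 0.0), time.time())
print(extrapolate(snap, time.time() + 0.05))  # position ~50 ms later
```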
  14. All things are measurable, which means they can be represented by another type of data; 3d is the easiest. Figure out the bounds of the potential data. It may be less than the set, but your representation has to be more. Then take the model used and turn it into another model. If it can't be translated into another thing, it literally means it doesn't exist; it's fantasy. If not, you misunderstood it. There is no thought experiment involved; the example is faulty. If it's quantifiable, it's translatable. Period. Else it literally can't be quantified: it has no parameters to be measured and understood. That is the literal requirement for using it as a computation. You are completely mistaken. And if you use it to compute, the only part you are using is the part you can predict, which means you can quantify it. That is the very definition of computing something. If you don't know enough to translate it, you literally don't know enough to use it to compute something. That is literally how you do it.
  15. That is about how I imagined this when I first thought of it. It was probably from reading about those at some point. 8) Originally I wanted a permanent state that could hold even with the power off, to save electricity, and a read (light-detection) state while on, for error checking. That would make it RAM and HDD at once. And if you do it with enough bit length and enough data, you don't need constant refresh: you can refresh at the start of a program and not need to do anything until the end of it. Basically, you could read from both ends. The permanent state could be retrieved, and the light detector could read for error correction in real time, assuming it could be done at the correct speeds. Although error correction would be more complicated than in normal RAM, etc., unless the light read time is faster. I guess it would depend on the hardware.
Maybe they could design monitors to let the GPU read data from them as extended video RAM. Although you might want unused parts around the edges, or some other hidden place. Maybe extra pixels around the edge: the frame allows sight of the normal resolution, and the rest is hidden for computational purposes. They could even hide sensors in the frame and use software to detect accuracy in the light range, filtering out data from the surrounding image correctly. Or just read the last state. That is a lot of free RAM, potentially. If you use it for slow-changing or non-changing data, it might help deal with certain problems. Although I guess it could be the opposite.
NVM, you said buffer. I was thinking the state was preserved in the LED. Either way, that would be cool. Have they considered adding mass extra buffers and using them like this? Or is that part of modern GPU compression? Would you still need to read from a visual device to get compression/decompression to extend storage?! I wonder which would be cheaper.
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.580.2521&rep=rep1&type=pdf
Definitely the same basic method of handling data. 8)
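A toy version of the read-back-and-verify loop described here, with a byte list standing in for a patch of pixels and a CRC standing in for real error correction (actual correction, rather than just detection, would need an ECC code):

```python
import zlib

def write_region(data: bytes):
    """'Display' the bytes as pixel values and remember a checksum."""
    return list(data), zlib.crc32(data)

def read_region(pixels: list, crc: int):
    """Re-read the pixels and report whether they still match the checksum."""
    data = bytes(pixels)
    return data, zlib.crc32(data) == crc

pixels, crc = write_region(b"frame-buffer payload")
pixels[3] ^= 0x01                 # simulate one misread/decayed pixel
data, ok = read_region(pixels, crc)
print(ok)                         # False -> trigger a re-read or rewrite
```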