32-bit Optical Computers


Arugela


https://ieeexplore.ieee.org/document/301975/references#references

 

I posted another thread on this with my own ideas a while back. It would create the throughput to make a home computer a supercomputer, and I don't understand why these don't exist. I'm pretty sure that mixed optical and electronic designs could be useful, since the electronics could do calibration work and potentially get around other issues, especially to start with. It would add a massive single core, which would basically be good for this game and others like it. Or it could be an entire system that a basic system hands work off to, letting the base system stay in super low power mode. It could easily be driven by a GPU or other chips.

With microLED coming out, it should be possible to massively boost existing computer power. And given the nature of optical, the light can be processed in a million ways to exploit the throughput, including specialized output from the optical side. You can easily separate the data, step it down, and send it back to another component. A driver able to switch bit depth/color or other attributes could control data on top of the physical paths for the light, so more and more attributes could carry information on top of any traditional gate logic being built. Part of one idea would be to use bit depth along different wavelengths. Changing the wavelength could change the propagation timing, either to swap data arrival times or to fold data into itself like shuffling a deck of cards, potentially in many, many ways. Optical should hypothetically bring in an era of basically unlimited software and hardware combinations and functionality. The limit will be actually thinking them all up.

MicroLED driving at 500 ns would be 1 petabyte of throughput; 1 ns would get you half an exabyte. This is at 32 bit. It seems like we are already at the point where this is better than pure electronics. I'm assuming it's a matter of effort to design sufficient means to utilize and control it, which we should have. I randomly think of enough things separately. If enough are made up, you could combine them to control data however you want. There wouldn't be a lot of things needing to be developed to do this. Could a mirror with a synced microLED light be used to manipulate the data at a node? Or does light not combine that way? Are there ways reflections and angling can affect the output? If it can be done even in the smallest way, you can use it to manipulate data. I.e., you have a 32-bit or other reduced (or increased) stream and then have it reflect at a node with different light to affect the data.
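As a sanity check on those numbers: a single 32-bit link that switches once per 500 ns moves about 8 MB/s, so the petabyte figures only follow if the throughput scales across an enormous number of parallel channels. A quick Python back-of-envelope (the 1 PB/s target and the channel counts are illustrative assumptions, not hardware figures):

```python
def channel_rate_bytes_per_s(word_bits: int, switch_time_s: float) -> float:
    """Bytes per second moved by one word-wide link that switches
    once per switch_time_s (assumes one word delivered per switch)."""
    return word_bits / 8 / switch_time_s

# One 32-bit link at a 500 ns switch time:
rate = channel_rate_bytes_per_s(32, 500e-9)   # 8e6 bytes/s (8 MB/s)

# Parallel channels needed to hit 1 PB/s at that switch time:
channels = 1e15 / rate                        # 1.25e8 channels
print(f"{rate:.0f} B/s per link; {channels:.3g} links for 1 PB/s")
```

So the headline figures stand or fall on massive parallelism, which is exactly where the multiplexing ideas below come in.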

You could parallelize and shift the data into colors (frequencies)/depths or whatever else, and use this to recombine and process higher-bit-depth information. It can be increased or decreased to any size and utilized as many times as needed. If you can add a synced (or purposely unsynced) reprogrammable light at each node, you can hypothetically process anything. You are basically folding and shifting data into itself in a way that gets the results. This can be done to the point it acts like gates, or in different manners with more predictive, simple software and sufficiently preconfigured data. All can be used at once.

They were looking at this back in 1989. It has to be more feasible now; if not, it should be close. It has to be at least useful for enhancing existing supercomputers or networks.

Also, to deal with output data and the size of software, you could easily compress data with logic similar to encryption. With the speed of optical computers you could package large streams of data into small packets and live-unpackage them on either end, or store them. Let alone if you aren't worried about live data and have time to let it run longer to increase the data size, or run the software in chunks. You could literally waste nanoseconds of throughput to sequence the software in a way that can be controlled with light frequencies and run in many layers to control the data output. I wonder if this would help with physics. If you don't need the calculation every second, you could layer it like an OS and other computer layers do and simply have it arrive on time. Frequency or color shifts could be timed with hardware so data changes layers or arrives between layers and is used in different ways by read sensors or other mechanisms.
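The package/unpackage idea maps onto ordinary stream compression. A minimal sketch of the round trip using Python's standard zlib (the payload is made-up sample data):

```python
import zlib

def package(stream: bytes, level: int = 9) -> bytes:
    """Compress a stream into a smaller packet before sending it."""
    return zlib.compress(stream, level)

def unpackage(packet: bytes) -> bytes:
    """Losslessly recover the original stream on the other end."""
    return zlib.decompress(packet)

payload = b"sensor frame " * 1000      # redundant data compresses well
packet = package(payload)
assert unpackage(packet) == payload    # round trip is exact
print(len(payload), "->", len(packet))
```

Real data rarely compresses this well; the achievable ratio depends entirely on how much redundancy the stream carries.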

I'm assuming the current issues are either cost or security. I don't think raw processing ability is the limit at the moment.

One easier method of getting data to components is to shrink down the color bit depth going to a smaller component. This can be done in many ways, including ignoring other data, keeping the needed data in a fixed spot, and treating a higher stream as a lower one. You might even be able to have variable data: anything within the x-y band of the higher-bit stream would still be treated as x at the lower bit depth. This could also be done in many ways. An alternative is to actually separate the data into a smaller stream and send it to the component. Optical should open computers up to basically infinite combinations. Unless you build one that does all of it; then you just need software to choose what to do. I'm pretty sure all methods can be combined, to boot.
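Collapsing a band of high-depth values onto one low-depth code, as described above, is just a truncating shift. A minimal sketch (the 32-to-16-bit widths are arbitrary assumptions):

```python
def downshift(sample: int, high_bits: int, low_bits: int) -> int:
    """Treat a high-depth sample as a low-depth one by keeping only the
    top low_bits bits; every value inside the same band of width
    2**(high_bits - low_bits) collapses to a single low-depth code."""
    return sample >> (high_bits - low_bits)

def upshift(code: int, high_bits: int, low_bits: int) -> int:
    """Re-expand a low-depth code to the bottom of its high-depth band."""
    return code << (high_bits - low_bits)

# Any 32-bit value inside one 16-bit-wide band reads as the same 16-bit code:
assert downshift(0x12340000, 32, 16) == downshift(0x1234FFFF, 32, 16) == 0x1234
```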

You can also use things like heat and calibration methods to help, or to add new ways to control the data flow. Even bad processing can be used: you don't need accuracy if a subsystem or something else can shift the result so it automatically becomes what it should be at any given node. Or you can do this on purpose to process in various ways. This in itself could be a massively powerful way to do something similar to gates, or other higher-bit-depth changes. Another is to do complex geometric processing. If you know what the result should be and shift toward it, you can do real live geometry. Say you have a cube design with read sensors representing real positions. It could even use ambient light (assuming you can predict and control the proper outcome) and then shift to the correct data for a circle. The hardware could be put in actual geometric positions to get real data, and increasingly complex patterns could be laid over each other. The shifting could also allow improper positions, so read/write heads that should share a space could both work. This could potentially allow other, inaccurate shapes to be used to layer processing more efficiently, assuming you even need to do that.

At minimum, to make such a reflector you could use the equivalent of a triple diode with two inputs, or some way to get a lossless read head and redo the image with a new microLED. Basically what they do with electric signals: redistribute it. Not sure if that is useful though. Could you combine frequencies, if accurate enough, layer them, and use that? If a read head can handle the logic at the end, or you know what the logic should be, it might not matter. In fact, having both would give more ways to process information: combined and non-combined light could add more data per moment to utilize, making even more throughput, which can then be used or sacrificed during use. Parallelizing things gives infinite ways to separate and combine data. And bit depth can be near-infinitely increased or decreased, and used or sacrificed for the sake of various tasks.

 

Yes, I know this is a bus. This would be processing on a bus! Anything else can be added into it, giving far more processing overall and in much more complex/robust combinations.

 

And if quantum ever comes out and can send data faster across space with no physical connection, this could expand it even more, assuming it sends data faster. There was that method where two particles are supposed to sync up over a distance, live. Assuming that has no lag, it could lead to ways to process geometry and other things that are impossible now. Maybe literally do what are currently impossible calculations.

 

Maybe this concept could be used:

The simplest application would be filtration methods, which could be combined with various recombination methods. You could do something like parallelize, then use the LED method to modify single sections of data, and then recombine with programmability. You could also try to make the filters programmable. That could take the place of using a synced-up microLED with reflectors, assuming that even works; which, if you can recombine light, it would have to in essence. Although if a read-head method could be used, it could be combined, if it doesn't sacrifice anything.
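In signal terms, "filter then recombine" is frequency multiplexing, and separating a channel back out is a Fourier projection. A toy Python sketch (the sample rate, window length, and the two channel frequencies are arbitrary assumptions): two tones stand in for two wavelengths sharing one medium, and correlating against each reference tone recovers that channel's amplitude.

```python
import math

RATE = 8000    # samples per second (assumed)
N = 800        # analysis window; both test tones fit an integer number
               # of cycles in it, so the projections come out clean

def tone(freq: float, amp: float) -> list:
    """One windowed sinusoid standing in for a 'color' channel."""
    return [amp * math.sin(2 * math.pi * freq * n / RATE) for n in range(N)]

def detect(signal: list, freq: float) -> float:
    """Project the signal onto a reference tone (a single-bin Fourier
    transform, i.e. a crude band-pass filter) and return its amplitude."""
    re = sum(s * math.cos(2 * math.pi * freq * n / RATE) for n, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * n / RATE) for n, s in enumerate(signal))
    return 2 * math.hypot(re, im) / N

# Two "colors" carrying different amplitudes, combined on one medium:
combined = [a + b for a, b in zip(tone(440, 1.0), tone(1000, 0.5))]
print(round(detect(combined, 440), 3), round(detect(combined, 1000), 3))  # 1.0 0.5
```

This is the same mathematics a GPU FFT filter applies; the open question is whether an optical element can do the projection more cheaply than silicon.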

Could this game get full physics with such throughput?

It could also get rid of the need for expensive video cards, unless you want one to drive the microLED programmable nodes.

You could also use the color depth within a single color or across colors. This could be used to different effect, since the data might arrive a split second later; it could then be read, sequenced, and retransmitted, assuming that saves time and the subsystem is fast enough (or another method is used). You could also scale color depth into a single color or spectrum to change various aspects of the data. This could pair with distance for static arrays, or whatever other trick could be applied.

A lot could be done with data structures within the bits, and they could be adjusted in many ways to control or change data, especially if you know your data well enough.


First of all, what gave you the idea home computers are 32bit? Get a real computer. :) Today, 4GB RAM is only enough for office applications, and even then, you better not use a Chrome-derived browser, because it'll happily hog more than that if you have a few tabs with "modern" webpages open. In fact, it's barely enough for a smartphone. 

As for the rest, optronics are a thing, but they won't let you do miracles. At most, they might let you cram a supercomputer inside a PC-sized box. Power, cooling and price will still be supercomputer grade. Of course, eventually performance will hit the physical limits of electronics, and designers might turn to optronics then, because no matter what you do, there'll be gamers who want more performance, and bad programmers who would gladly use the increased performance margins to cover their sloppy coding. That's quite far off, though.


14 hours ago, Arugela said:

https://ieeexplore.ieee.org/document/301975/references#references

 

MicroLED driving at 500 ns would be 1 petabyte of throughput; 1 ns would get you half an exabyte. This is at 32 bit. It seems like we are already at the point where this is better than pure electronics. I'm assuming it's a matter of effort to design sufficient means to utilize and control it, which we should have. I randomly think of enough things separately. If enough are

1 ns switching = 1 GHz. This has been consumer-electronics (on-chip) grade since 2000ish, and behind the times recently. But you might say, that is only 1 bit! Your idea uses 32 bits. Except that 32 bits require a 1.8 × 10^15 dB signal-to-noise ratio, so that isn't happening. Sure, you might be able to multiplex this across multiple devices with different frequencies (hopefully you can filter out all the overlap at optical frequencies), but remember that the 1 GHz intra-chip transmitter is only a few transistors out of billions available on chip; each optical transmitter plus its combination of mirrors/lenses will be much bigger. As far as filtering goes, that is one of the first things GPUs were programmed to do (specifically FFTs, but filtering is probably the most important application of FFTs). Nowadays you can easily use teraflops (yes, *teraflops*) to do filtering on your GPU, so I'm guessing the market is long since saturated, unless you can fit a small optical device on an SoC.
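For context on the SNR point: the generic ADC rule of thumb (not a figure from the post) is that n-bit resolution needs an ideal quantization SNR of 1.5 × 2^(2n) as a power ratio, i.e. about 6.02n + 1.76 dB. For 32 bits that is a ratio on the order of 10^19, or roughly 194 dB. A quick check:

```python
import math

def snr_for_bits(n: int):
    """Ideal quantization SNR for n-bit resolution: a power ratio of
    1.5 * 2**(2*n), equivalently about 6.02*n + 1.76 dB."""
    ratio = 1.5 * 2 ** (2 * n)
    return ratio, 10 * math.log10(ratio)

ratio, db = snr_for_bits(32)
print(f"{ratio:.2e} power ratio = {db:.1f} dB")   # ~2.77e+19 = ~194.4 dB
```

Either way you write it, the requirement is far beyond what any analog optical level encoding could deliver in one channel.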

Lucky for optical computing researchers, Moore's law seems to be slipping to barely a guideline, so it is possible that a breakthrough in optical computing won't be obsoleted by digital electronics before it even works in the lab.  But the bar is still high.

Intel was promising optical communications on the motherboard something like a decade ago, but nothing came of it. Now they are promising (since at least 2016) to put multiple little chips on a cheaper large chip for communication. This would be especially helpful for optical chips, as they are unlikely to work with the same fabrication process as standard CMOS.


6 hours ago, Dragon01 said:

First of all, what gave you the idea home computers are 32bit? Get a real computer. :) Today, 4GB RAM is only enough for office applications, and even then, you better not use a Chrome-derived browser, because it'll happily hog more than that if you have a few tabs with "modern" webpages open. In fact, it's barely enough for a smartphone. 

As for the rest, optronics are a thing, but they won't let you do miracles. At most, they might let you cram a supercomputer inside a PC-sized box. Power, cooling and price will still be supercomputer grade. Of course, eventually performance will hit the physical limits of electronics, and designers might turn to optronics then, because no matter what you do, there'll be gamers who want more performance, and bad programmers who would gladly use the increased performance margins to cover their sloppy coding. That's quite far off, though.

64 bit is about a lot more than a fatter address bus.


Actually, I was wondering if you could use things like overlap on purpose as a processing method. Why avoid it when you can use a subsystem (or really good alignment) to help utilize it? You could take expected results and modify them into whatever you want, unless I'm missing something (which is easy). I was assuming you could employ asymmetric means to get data or processing you couldn't otherwise. Optical may lead to increased means of processing information if you don't restrict yourself to just the normal logic we use in electronics and instead treat optical as a new beast. Why stick with simple point-to-point processing when optical allows more?

I'm assuming radically new things can be done at this point.

Not sure what the signal-to-noise ratio is for light; I'll have to look that up. I thought light avoided such things altogether. Although with my base idea I'm intentionally taking all such effects and trying to use them to process more information for much more robust systems. Take heat and other attributes too, then allow software to decide when they're needed. They can be used for observation or as part of various means of processing data.

And if it's relevant, can't light be run through fiber optics or isolated with a physical medium to control saturation at various points? Especially if you want to use it as I just mentioned. If you can predict the results of using different media, you could even use that to process information. Anything sufficiently predictable is a method of processing information.

I imagine with the ability to read or separate light physically there might be a lot of methods to do this. With a subsystem, I assume there is a little time to read and retransmit on a separate node, to separate data or otherwise utilize it by other means along the path(s). There should be a lot of very useful ways to utilize all aspects of light.

I'm also assuming a massive potential combination of differently sized devices; use different things in different ways. Larger parts can be good for swapping out later, for instance. Smaller parts are potentially good for space and material costs. And large can be good for early adoption, or other uses of space in the opposite direction, depending on its functionality.

Plus, combined media like light/electronics could be overlapped on purpose to get rid of normal weaknesses where needed, assuming that isn't already being done. I would assume some of this could be done with electronics too, though I don't know at what cost. I'm assuming optical could get more out of it, and that it may be particularly useful to employ these methods for it.

If sufficient means are produced, you can process at the nodes along buses and within dedicated chips. In fact, chips could take on whole new purposes, as they could be considered much weaker, or a means to manipulate the bus data. Methods could even be left up to software with sufficient versatility, which I'm assuming exists in abundance and should be developed if possible. That should help make up for cost, as the bus is a giant chip. A possibly modifiable chip with chips inside it.

8 hours ago, Dragon01 said:

First of all, what gave you the idea home computers are 32bit? Get a real computer. :) Today, 4GB RAM is only enough for office applications, and even then, you better not use a Chrome-derived browser, because it'll happily hog more than that if you have a few tabs with "modern" webpages open. In fact, it's barely enough for a smartphone. 

As for the rest, optronics are a thing, but they won't let you do miracles. At most, they might let you cram a supercomputer inside a PC-sized box. Power, cooling and price will still be supercomputer grade. Of course, eventually performance will hit the physical limits of electronics, and designers might turn to optronics then, because no matter what you do, there'll be gamers who want more performance, and bad programmers who would gladly use the increased performance margins to cover their sloppy coding. That's quite far off, though.

I went with 32 bit because of the ability of monitor technology to produce it easily. If you can get more bandwidth, then yes, that would be fantastic. Optical could hypothetically do this through a more powerful driver, by splitting the data, or by other means. Or so I'm assuming. I'm also assuming that for initial users this might be worth the extra cost, particularly if you could massively boost current servers or supercomputers. Unless it would already be cheap enough for home computers. I would rather hope the diverse processing means I mentioned would allow cheaper ways to get home versions earlier, as the increased adaptability of optical may allow earlier, cheaper methods with sufficient overlap in design. Plus a wider, more interesting means of new adoption on all sides. If you can just switch out nodes, for instance, that's a lot better than buying whole new boards, so it may come with new versatility in upgrading. Or even changing out the hardware per software run, in the extreme, if desired. Especially when you hit limits in various directions, like frequency and space of the optical device, to get certain results. Although more means should exist to make alternatives too, because of all of this.

And since we're assuming 32 bit at a varied nanosecond response time, you are getting more than 32 bits from a normal computer. That could allow it to be used on older computers, and the same goes for smaller bit ranges. With sufficient design changes and parallelizing or other methods, this could be applied to much older computers for fun; the same end results could mostly be achieved, depending on where and how you offload the processing.

The bit depth would only matter if you are somehow doing a very direct simulation of a current computer with optical.

 

Also, a fun note: board designs could change too. You wouldn't need static boards; you could change multiple aspects of boards and use them all in varied, intermixed ways. You could have solid boards with permanent parts, or solid boards with adjustable, replaceable parts. You could have partial boards that stick together and are stiffened, allowing interconnection for various reasons, like physical distance for optical, or similar for electronics. You could have flat boards, or even use the interconnects to bend them on purpose into shapes. You could then also have variations like Lego blocks of various sizes and shapes making 3D patterns, etc. Both 3D and 2D boards could be used to make 3D processing shapes, as you could use nodes with different-height optics or have the board carry the light paths, to varied effect.

And as you develop node technology, you can take every single tiny quality and use it to process. You don't have to be stuck with one thing at a time, unless you have to be. If you are stuck with one at a time, you can switch between them with software, if that's useful for different applications at different times, or split them to do both at the same time. It depends on how much redundancy you want per node.

And optical can potentially utilize space more than electronics, as you can use accuracy and distance to compute. If you can swap data between colors/frequencies while retaining it, you can easily use space with either accurate or inaccurate methods to deal with physical space constraints, or many other issues, since the space itself can be used for compute. Basically everything can be used to compute, no matter how stupid, small, or insignificant. And the more it's combined with, the more useful any one thing can become. The combination of accurate and inaccurate data could also have wide applications. Very wide applications! 8D The difficulty should be figuring out all the methods of compute and then utilizing them. One of those is using calibration to let cheaper parts make starter boards for home use. I.e., cost!

You could also radically start incorporating other means, like audio processing or anything else ever conceived. It doesn't have to keep up to be useful; there could be uses. So we could get back to far more than just optical and electronics.


53 minutes ago, Nuke said:

64 bit is about a lot more than a fatter address bus.

Yes, but 32 bit works just fine in most other aspects. I think the fatter address bus was the primary driver behind most consumer hardware switching to 64 bit. Likewise, games became 64-bit only so they can demand 8 or, as of late, 16 GB of RAM instead of putting thought into decent memory management.


20 minutes ago, Dragon01 said:

Yes, but 32 bit works just fine in most other aspects. I think the fatter address bus was the primary driver behind most consumer hardware switching to 64 bit. Likewise, games became 64-bit only so they can demand 8 or, as of late, 16 GB of RAM instead of putting thought into decent memory management.

you also have fatter operations. probably the most useful advantage is being able to feed the fpu with one load instead of two, and likewise retrieve the result in one instead of two. including the fpu instruction itself, this represents a 40% performance boost when using doubles. then when you take vector extensions (such as avx) into account, which are very wide operations requiring a lot of loads and stores, you nearly double the performance. not to mention doubling the data bus, which doubles memory performance.
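The one-load-versus-two point can be illustrated by how a 64-bit double splits across a 32-bit data path. A small Python sketch using `struct` (little-endian byte order assumed):

```python
import struct

def double_to_words32(x: float):
    """Split a 64-bit double into the two 32-bit words a 32-bit machine
    must load/store separately (two memory operations, not one)."""
    return struct.unpack("<II", struct.pack("<d", x))

def words32_to_double(lo: int, hi: int) -> float:
    """Reassemble the double from its two 32-bit halves."""
    return struct.unpack("<d", struct.pack("<II", lo, hi))[0]

lo, hi = double_to_words32(1.5)
assert words32_to_double(lo, hi) == 1.5   # round trip through two words
print(hex(lo), hex(hi))                    # 0x0 0x3ff80000 for 1.5
```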

should also point out that there are instances where 32-bit machines can have more than 4gb through memory banking techniques, though this was usually in the server space and usually came with per-application memory limits.
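The banking trick mentioned boils down to gluing a bank register onto the 32-bit offset, so total physical space exceeds what any one 32-bit pointer can see. A toy sketch (the 4-bit bank width is an arbitrary assumption):

```python
BANK_BITS = 4        # 16 selectable banks (illustrative)
OFFSET_BITS = 32     # each bank is a full 4 GiB window

def physical_address(bank: int, offset: int) -> int:
    """Combine the bank register with a 32-bit offset. Total physical
    space is 2**(BANK_BITS + OFFSET_BITS), but any one application still
    sees only a 4 GiB window at a time."""
    assert 0 <= bank < 2 ** BANK_BITS
    assert 0 <= offset < 2 ** OFFSET_BITS
    return (bank << OFFSET_BITS) | offset

total_gib = 2 ** (BANK_BITS + OFFSET_BITS) // 2 ** 30
print(total_gib, "GiB of physical address space")   # 64 GiB
```

This is the same shape as x86 PAE: the OS swaps the bank/window mapping, which is why per-application limits remained.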

Link to comment
Share on other sites

Can you saturate a lens from the side and change aspects of light going through it? Or anything else, like heat? Anything that can affect the light in any way can be used. Even changing the heat of the device the light goes through could be used, as it could send a signal to something as calibration or a purposeful data change.

Here's a funny idea: you could literally throw a rock in your computer. You hit it with light and other things, read and calibrate from many angles, and use that to run various things through to get end results with reads. This could be used for many things, one being fun things like D&D: specialized toys for tabletop, although that would be one of a million applications. Rocks, by the way, are heavily laden with materials. So instead of having a specialized device, you can pick something up from your yard and check it for usefulness (easy when attached to some electronic subsystems or other sufficiently advanced optical or computer methods). Different rocks, different materials and results. It could literally be a switch for endless applications, especially if you are using more than just light: sound/vibration, temperature, and many other things could be used. It's just a matter of what needs to be done to sync with the optics level, which can be done with many different read/write nodes on the rock. You could parallelize them to sync sufficient data, or use multiple rocks to fill in the blanks, depending on use. There are endless ways to do this. Optics is the return of a lot of potential compute methods, as it's very physical in nature. Ironically, as it's presumed not to be physical.

