[oc] Beyond Transmeta...
I'm not sure if this is the right place to talk about this, or whether this idea
already exists (or whether Ditzel has even thought of it yet). I am not an
electronics engineer (a software engineer/artist/philosopher, I suppose), and I
am not a rich eccentric looking to blow his money on an eccentric idea, but a
poor young adult looking to vent ideas that haunt me, ideas I fear I will never
know the answer to if I do not ask. So I am interested in discussing a processor
that goes in the same direction as Transmeta, but all the way to its ends (I
suppose). All I offer is ideas: if you can take one and run with it, please do;
if you see something wrong with it, please comment; if you want to vent anger on
me for some reason, please don't, for some day you may be in a humble position.
One of the things Transmeta and RISC are doing is reducing instructions. This
reduces processor complexity, but of course increases software complexity. I do
not consider increasing the complexity of software a bad thing, because the
future of software development is going to be more evolutionary/adaptive. One
problem with this direction is that instructions keep growing as the bit depth
of a processor increases, so you need different instructions for 64-bit, 32-bit,
and 16-bit operations. One way to eliminate this is to turn the idea around:
have a 1-bit processor, and build many 1-bit processors into the same unit, so
that 64-bit is really 64 1-bit processors working in parallel.
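To hedge the "64 1-bit processors" picture with an everyday observation: a
bitwise operation on a 64-bit word already behaves like 64 independent 1-bit
operations, one per bit position. A minimal Python sketch of that view (the
variable names are mine, purely illustrative):

```python
# A bitwise AND on a 64-bit word is 64 independent 1-bit ANDs,
# one per bit position -- the "many 1-bit processors" view.
MASK64 = (1 << 64) - 1

a = 0b1100 & MASK64
b = 0b1010 & MASK64

word_result = a & b  # one 64-bit operation

# The same result, computed as 64 separate 1-bit operations:
bit_result = 0
for i in range(64):
    bit_a = (a >> i) & 1
    bit_b = (b >> i) & 1
    bit_result |= (bit_a & bit_b) << i

assert word_result == bit_result == 0b1000
```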
You may be wondering how this would work (or whether I have any idea what I am
talking about), but let me say that this processor would be used for processing
what I call a binary network, and as I calculate it there are 16 possible
instructions. It may be easier to say that what I am calling an instruction is
really a gateway, or relationship, between bits. Think about it for a moment:
say you have 2 bits that combine into 1 bit. That is 4 combinations for those 2
bits, and each of those combinations has 2 possible answers (zero or one), so
specifying one answer per combination takes 4 bits, and 2^4 is 16, which is 16
instructions...
4 combinations :
0~0
0~1
1~0
1~1
2 possible answers per combination:
0~0 = 0
0~0 = 1
0~1 = 0
0~1 = 1
...
That is a 4-bit instruction, which is 16 instructions altogether.
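That counting argument can be checked directly: each instruction is just the
4-bit output column of a truth table, so enumerating codes 0000 through 1111
enumerates every possible 2-in/1-out gate. A small Python sketch (the
`apply_gate` name is my own, purely illustrative):

```python
def apply_gate(code, a, b):
    """Treat a 4-bit code as a truth table lookup for inputs (a, b).

    The code's bits are the outputs for input pairs 00, 01, 10, 11 in
    order, so e.g. code 0b0001 is AND and 0b0111 is OR.
    """
    index = (a << 1) | b            # 0 for inputs 0,0 ... 3 for inputs 1,1
    # The output for 00 is the leftmost written bit (bit 3 of the code).
    return (code >> (3 - index)) & 1

# Enumerate all 16 instructions and show each truth table.
for code in range(16):
    outputs = [apply_gate(code, a, b) for a in (0, 1) for b in (0, 1)]
    print(f"{code:04b}: {outputs}")
```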
One example is the bit `and' instruction. It would be 0001...
0~0 = 0
0~1 = 0
1~0 = 0
1~1 = 1
Another operation would be the bit `or' instruction. It would be 0111...
0~0 = 0
0~1 = 1
1~0 = 1
1~1 = 1
The bit `add' instruction would be 0110; the reason is this...
0~0 = 0
0~1 = 1
1~0 = 1
1~1 = 0
(you can imagine the ~ as a + for this one)
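The three codes above can be verified in a few lines of Python: look up each
code's output bit and compare it with the usual operator. The `gate` helper and
the constant names are mine, a minimal sketch of the truth-table reading given
in the text:

```python
def gate(code, a, b):
    """Look up the output bit for inputs (a, b) in a 4-bit truth-table code.
    The code's bits are the outputs for input pairs 00, 01, 10, 11,
    left to right as written in the text."""
    return (code >> (3 - ((a << 1) | b))) & 1

AND, OR, ADD = 0b0001, 0b0111, 0b0110  # the three codes from the text

for a in (0, 1):
    for b in (0, 1):
        assert gate(AND, a, b) == (a & b)
        assert gate(OR, a, b) == (a | b)
        assert gate(ADD, a, b) == (a ^ b)  # 1-bit add without carry is XOR
print("all three codes check out")
```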
Using these instructions you can chain `add', `and', and `or' instructions in a
network to create multibit addition (3-bit, 4-bit, 5-bit, 6-bit... 32-bit...
n-bit). For the first bit you would have one part that `add's the first 2 bits,
and another that `and's those same 2 bits to get their carry. Further along you
will need the carry from the addition of 3 bits (the 2 input bits plus the
incoming carry): first `or' the first 2 of the 3 bits, then `and' the 3rd bit to
the result of the `or', and finally `or' that with the `and' of the first 2
bits, so that no carry case is missed. If you lay this out as a network it would
probably look like a diagram of a processor, except it would be a software
network that can be rearranged to do different things or emulate different
processors.
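One bit position of the network described above can be sketched as a full adder
built only from the three gate codes. The helper names are mine, and the final
`or' merging the two partial carries is the completing step for the carry
chain, assuming the standard full-adder reading of the text:

```python
def gate(code, a, b):
    """4-bit truth-table lookup; code bits are outputs for 00, 01, 10, 11."""
    return (code >> (3 - ((a << 1) | b))) & 1

AND, OR, ADD = 0b0001, 0b0111, 0b0110

def full_adder(a, b, cin):
    """One bit position of the network, using only the three gate codes."""
    s = gate(ADD, gate(ADD, a, b), cin)      # sum of all three bits
    carry1 = gate(AND, a, b)                 # carry from the first two bits
    carry2 = gate(AND, gate(OR, a, b), cin)  # carry involving the third bit
    return s, gate(OR, carry1, carry2)       # merge the two partial carries

# Exhaustively check against ordinary arithmetic.
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s, cout = full_adder(a, b, cin)
            assert a + b + cin == s + 2 * cout
```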
You can probably figure out what the other instructions represent... `zero'
0000, `one' 1111, `bit1' 0101, `bit2' 0011, etc...
You may still look at this as a normal processor, but then you have memory
issues (1-bit registers can't address 4 GB of RAM), instruction size issues
(despite being only 4 bits each, it still takes a lot of instructions to do a
normal CISC operation), extra instructions for branching, and performance
issues. How I see it is beyond what a normal processor does: because of the
processor's size, bandwidth issues may be reduced by placing the processor
closer to, or even possibly inside, the memory.
How I see it, it works as an entity in a network, executing relationships as it
needs to, stepping through the network. Maybe a kind of queue is set up that
sends instructions to the 1-bit processor(s): when a bit's value changes,
dependent relationships are sent to the queue to be executed, so one change
causes a chain reaction of other instructions to be executed, and the processors
each grab one from the queue to execute within a row (there would have to be
rows to prevent potential conflicts). I'm not really sure how this would be set
up (I'm not much of an electronics engineer, after all; I normally see things in
software), but you would need a sort of database of instructions, which in
effect call each other in a reaction sort of manner. This reaction would also
occur when an input device's value changes, like the mouse, for example. When
the mouse's X value changes (mapped into memory), there is a chain reaction from
the bits connected to those bits, which will more than likely end up affecting a
group of bits representing a place on the screen known as the cursor :).
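The chain-reaction idea reads like an event-driven gate simulator: each bit
keeps a list of dependent relationships, and changing a bit queues its
dependents for re-evaluation. A toy sketch under that reading, with a two-gate
network; all names and the network layout are mine, purely illustrative:

```python
from collections import deque

def gate(code, a, b):
    """4-bit truth-table lookup; code bits are outputs for 00, 01, 10, 11."""
    return (code >> (3 - ((a << 1) | b))) & 1

AND, OR, XOR = 0b0001, 0b0111, 0b0110

# Each relationship: (output bit, gate code, input bit, input bit).
# This tiny network computes bit2 = bit0 XOR bit1 and bit3 = bit0 AND bit1.
network = [(2, XOR, 0, 1), (3, AND, 0, 1)]

bits = [0, 0, 0, 0]

# The "database": which relationships depend on each bit.
dependents = {0: [0, 1], 1: [0, 1], 2: [], 3: []}

def set_bit(i, value):
    """Change a bit and let the change ripple through dependent gates."""
    queue = deque([(i, value)])
    while queue:
        j, v = queue.popleft()
        if bits[j] == v:
            continue               # no change, so no further reaction
        bits[j] = v
        for entry in dependents[j]:
            out, code, a, b = network[entry]
            queue.append((out, gate(code, bits[a], bits[b])))

set_bit(0, 1)   # like one bit of the mouse's X value changing
set_bit(1, 1)
assert bits == [1, 1, 0, 1]   # XOR settled to 0, AND settled to 1
```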
In the beginning of this type of setup, x86 and RISC binary networks would exist
to run operating systems and software from other platforms, but eventually
applications will start incorporating these processing networks into themselves.
Or a special network that works much like a compiler will be created, one that
creates/extends a network from an application; the application is then
incorporated into the binary network, and from there it can evolve to the user's
preferences. This happens because the "compiler" is a self-modifying part of the
network, which could be given the ability to reproduce a better self, or cause
self-improvement; these self-improvements will be toward higher intelligence in
reproducing itself. This self-modifying part almost comes alive, and will
eventually have the ability to combine 2 or more applications to see what users
like, and the best combinations will be the evolution of better software.
I have been drawing diagrams of how to create a network for adding 2 4-bit
values, and it's not so bad; there is a pattern, so it's easy to see how to add
more bits. As soon as I start playing with dia, I may try to create the diagrams
in there to show what it looks like.
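The repeating pattern in those diagrams can be sketched as a ripple-carry chain:
every bit position is the same three-gate cell, so extending from 4 bits to n
bits is mechanical. A hedged Python version built only from the three gate
codes (the `add_nbit` name and the LSB-first bit-list convention are mine):

```python
def gate(code, a, b):
    """4-bit truth-table lookup; code bits are outputs for 00, 01, 10, 11."""
    return (code >> (3 - ((a << 1) | b))) & 1

AND, OR, ADD = 0b0001, 0b0111, 0b0110

def add_nbit(xs, ys):
    """Ripple-carry addition of two equal-length bit lists (LSB first).

    Each bit position repeats the same cell: XOR for the sum, and the
    two partial carries merged with OR, exactly the pattern that makes
    extending to more bits easy.
    """
    carry, out = 0, []
    for a, b in zip(xs, ys):
        s = gate(ADD, gate(ADD, a, b), carry)
        carry = gate(OR, gate(AND, a, b),
                     gate(AND, gate(OR, a, b), carry))
        out.append(s)
    return out, carry

# 6 + 7 = 13: LSB-first, 0b0110 + 0b0111 = 0b1101 with no carry out.
total, carry = add_nbit([0, 1, 1, 0], [1, 1, 1, 0])
assert total == [1, 0, 1, 1] and carry == 0
```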
I would like to hear what people here think. I would love to hear that someone
thinks they can implement it and try it out as an open core project, but I
wouldn't mind criticism of the idea either (a wake-up call to a sleepy
programmer), as long as it explains why. Maybe I am going nuts; here I am
talking to people in a totally different field from the one I am used to, hoping
you will understand what I want as a software developer. Or maybe software
engineering has given me too much freedom to think about things in strange ways,
so I am disconnected from the reality of what makes a good processor. Lately I
have been in a philosophical mind warp, bothering and confusing the hell out of
people, so if all else fails just call me crazy.
Leyland Needham