Restless Souls/Technology

The companies will push forward '''multi-core processors''' and '''vertical stacking'''. For example, Samsung already uses vertical stacking in flash memory; by now you should have heard of their V-NAND SSDs. The idea of using the third dimension should soon find its way into processor production to fulfill Moore's law.


A computer is meant to be a universal problem solver. But special problems require special solutions. So, CPUs got company - most notably - coprocessors for networking, sound (DSP), graphics (GPU) and physics (PPU). The ''lowest common denominator'' in terms of performance gets soldered as chips onto the mainboard. Like in organic evolution, after a phase of specialization, a phase of socialization follows. ^_^
 
* mainboards having "onboard" chips: networking, sound, graphics
* combined CPU and GPU: AMD's APU and Intel's CPUs with integrated HD Graphics
* GPU with further specialized parts: PPUs via CUDA (after Nvidia bought PhysX), general-purpose GPUs (GPGPU), and TPUs


Due to the different demands in processing power, most "power house" chips - CPU and GPU - can be exchanged either directly or by means of expansion cards.


But progress on the hardware level is only one side of the coin. You also need software to utilize it. A Russian team demonstrated that [http://phys.org/news/2016-06-scientists-pc-complex-problems-tens.html some scenarios requiring supercomputing can actually be handled on a desktop PC] after installing a new graphics card [http://arxiv.org/pdf/1508.07441.pdf (GTX-670, CUDA-supported) and self-written software.]
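The GPU win in such experiments comes from data parallelism: expressing the work as whole-array operations instead of per-element loops. A minimal CPU-side sketch with NumPy (not taken from the cited paper; the polynomial and array size are made up) - GPGPU frameworks such as CUDA apply this same formulation to thousands of elements at once:

```python
import numpy as np

# Per-element Python loop vs. one vectorized, data-parallel expression.
# GPGPU frameworks (CUDA, CuPy, etc.) rely on exactly this formulation:
# the same arithmetic applied to many elements simultaneously.
x = np.linspace(0.0, 1.0, 10_000)

looped = np.array([3.0 * v * v + 2.0 * v + 1.0 for v in x])   # serial loop
vectorized = 3.0 * x ** 2 + 2.0 * x + 1.0                     # array-at-once

assert np.allclose(looped, vectorized)
```

On a GPU the vectorized form maps each element to its own thread, which is where the "supercomputing on a desktop PC" speedups come from.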


Still, many PC programs can't draw advantage from multi-core processors and never will, because not every program needs multiple cores - a simple text editor, for example. However, the trend is here to stay: more cores. (See Intel's Core X and AMD's Threadripper.) Or more exactly: parallelism. It's the new hype, again. Machine learning and AI have just started to regain traction after the [[wikipedia:AI_winter|AI winters]].
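What "drawing advantage from multiple cores" means in practice: the program has to split its work explicitly. A minimal sketch (the chunk count of four is a placeholder for the real core count) using Python's standard multiprocessing module:

```python
# A minimal sketch: splitting a sum of squares across worker processes
# with Python's standard multiprocessing module.
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum i*i over the half-open range [lo, hi)."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n = 100_000
    serial = sum(i * i for i in range(n))

    # One chunk per core; four is a placeholder for the actual core count.
    chunks = [(c * n // 4, (c + 1) * n // 4) for c in range(4)]
    with Pool(processes=4) as pool:
        parallel = sum(pool.map(partial_sum, chunks))

    assert serial == parallel  # same result, computed on four cores
```

A text editor has no such divisible workload, which is exactly why extra cores don't help it.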
 
After these tricks (stacking, multi-core architecture, parallelism) have been fully exploited, the companies will be forced to turn to new technologies such as spintronics, in 2025.
Harnessing the magnetic property of electrons will result not only in lower power consumption but also in higher clock rates. In the past years, clock speeds stagnated between 4 and 5 GHz due to overheating. Processors based on spintronics won't heat up that fast, so they can operate at higher frequencies. And higher frequencies mean more calculations per second.
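As a back-of-the-envelope illustration of that last sentence (all numbers below are hypothetical, chosen only to show the proportionality):

```python
# Back-of-the-envelope model: peak throughput scales linearly with clock rate.
# All numbers are hypothetical; they only illustrate the proportionality.
def peak_ops_per_sec(cores, clock_hz, ops_per_cycle):
    return cores * clock_hz * ops_per_cycle

conventional = peak_ops_per_sec(8, 4.5e9, 2)   # a chip stuck near 4.5 GHz
spintronic = peak_ops_per_sec(8, 9.0e9, 2)     # a hypothetical 9 GHz chip

assert spintronic / conventional == 2.0        # double the clock, double the work
```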


* Majorana particle (Topological QC)


Each approach comes with its own difficulties: most parts have to be constructed from scratch, and the underlying logic rests on fragile particle states such as entanglement.

Powerful, universal quantum computers might just gain traction by 2050.

However, we should see more and more specialization and hybridization of processor types in the meantime. The demand for non-standard solutions can be seen in the existence of ASICs and FPGAs. But as even more processor types emerge, they will require AI to manage such highly diverse systems.
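To make "fragile particle states such as entanglement" a bit more concrete, here is a toy state-vector sketch in plain Python (no quantum hardware or library assumed): the two-qubit Bell state leaves both qubits individually undetermined, yet every measurement finds them equal.

```python
import random

# Toy state-vector sketch of entanglement (no quantum hardware assumed).
# Amplitudes over the two-qubit basis |00>, |01>, |10>, |11>:
# the Bell state (|00> + |11>)/sqrt(2) leaves each qubit undetermined,
# yet every measurement finds the two qubits equal.
bell = [2 ** -0.5, 0.0, 0.0, 2 ** -0.5]

def measure(state):
    """Sample one basis state with probability |amplitude|^2."""
    probs = [abs(a) ** 2 for a in state]
    outcome = random.choices(range(4), weights=probs)[0]
    return outcome >> 1, outcome & 1   # (first qubit, second qubit)

samples = [measure(bell) for _ in range(200)]
assert all(q0 == q1 for q0, q1 in samples)   # perfectly correlated outcomes
```

The fragility lies in keeping such superposed amplitudes intact on real hardware; any interaction with the environment (decoherence) collapses them.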


----
2017.02.01 - The first blueprint for a significantly large QC has been published. The [https://phys.org/news/2017-02-blueprint-unveiled-large-scale-quantum.html article on phys.org] had 5000 shares after one day (the counter had been reset), a number that is rarely reached.


The foundation of this kind of QC is trapped ions. They require bulky vacuum chambers. That's not very elegant, but due to economics' tendency to create cash cows, the development of better approaches will be delayed. It can only be hoped that the ion QC will provide enough computing power to accelerate further research and thereby compensate for the economic influence.


NV-centers should take over when the ion traps cannot be shrunk any further, thus giving birth to really "large scale" QC.

2019.10.23 - Google announced quantum supremacy by using superconducting circuits - at least for generating random numbers. So, for what it's worth: two out of four quantum processor technologies have reached infancy and will compete for leadership.


----


As you have already guessed, these numbers (2025, 2050) are just rough estimates. All I can do here is explain my estimates in some more detail.
The ultimate computing hardware will incorporate bio- and nanotechnologies, making itself shapeshifting hardware that adapts to the problems you give it.
 
Let's also take an approach via economics. A company will only invest a serious amount of money into a new R&D field if it doesn't have another option. If they try it anyway, there's a high risk of failure. The latest example is HP's The Machine, a still-hypothetical computer driven by memristors. The technological gap is still too wide.
 
<!--"We have silicon wafers. Find something to continue with that." Spintronics...
 
[...]
 
Chance of niche products.
 
[...]
 
neurosynaptic chips
 
memristors --><!-- chemical change vs. spin  --><!--
 
artificial neurons
 
cybernetic neurons
 
[...]
 
 
Self-conscious AIs will seek to migrate to associative machines since these are still safe from viruses and spyware, sheltering their "mind" from hostile humans. Also, that architecture supports pattern recognition.
 
 
http://www.assoziativmaschine.de/index2.html
 
http://www.deutschlandfunk.de/ausspaehsicher-computer-arbeitet-wie-das-menschliche-gehirn.684.de.html?dram:article_id=274935
 
[...]


Shapeshifting hardware-->


===Consequences===