CPU & GPU on one silicon die

Lost in space
Banned
💻 Oldtimer
Joined
Dec 21, 2003
Messages
3,608
Best answers
0
http://theinquirer.net/default.aspx?article=33678
ENGINEERS from AMD and ATI are going to start working on a unified chip that will have the GPU and CPU on the same piece of silicon. We learned this from high-ranking sources close to the companies, more than once.

Don't get too excited as it will take at least eighteen months to see such a dream come true.

This is the ultimate OEM chip, as it will be the cheapest way to get the memory controller, chipset, graphics function and CPU onto a single chip. This will be the ultimate integration, as it will decrease the cost of the platform and make even cheaper PCs possible.

CPUs are being shrunk to a 65 nanometre process as we speak, and the graphics guys are expected to migrate to that process next year. The graphics firms are still playing with 80 nanometre but will ultimately move to 65 nanometre later next year.

DAAMIT engineers will be looking to shift to 65 nanometre, if not 45 nanometre, to make a chip as complex as a combined CPU/GPU possible.
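To put rough numbers on why the node shrink matters (this is back-of-envelope arithmetic of my own, not anything from the article): die area scales roughly with the square of the feature size, so the same transistor budget gets a lot cheaper at 65 nanometre or 45 nanometre. A quick sketch in Python:

```python
# Back-of-envelope: ideal die-area scaling with process node.
# Assumes area scales with (feature_size)^2 -- a rough idealisation;
# real-world shrinks rarely achieve the full theoretical gain.

def relative_area(node_nm: float, baseline_nm: float = 90.0) -> float:
    """Area of the same design at node_nm, relative to a 90 nm baseline."""
    return (node_nm / baseline_nm) ** 2

for node in (90, 80, 65, 45):
    print(f"{node:>2} nm: {relative_area(node):.2f}x the 90 nm area")

# 65 nm -> ~0.52x, 45 nm -> ~0.25x: at 45 nm you could, in theory,
# fit a CPU plus a GPU in roughly the footprint of a 90 nm CPU alone.
```

That quartering of area at 45 nanometre is exactly the kind of headroom a combined CPU/GPU die would need.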

We still don't know whether they are going to put a CPU on a GPU or a GPU on a CPU, but either way will give you the same product.
However, it leads me to question whether such a product will use up room on the die that would otherwise be used for additional processor cores.
 
Live free or die by the sword
Retired Forum Staff
✔️ HL Verified
💻 Oldtimer
Joined
Dec 1, 2001
Messages
7,416
Best answers
0
Location
North East Pennsylvania
Well, think about what you just posted. The goal is to make the ultimate OEM chip: one package for everything, system and video.

This is the ultimate OEM chip, as it will be the cheapest way to get the memory controller, chipset, graphics function and CPU onto a single chip. This will be the ultimate integration, as it will decrease the cost of the platform and make even cheaper PCs possible.
You answered your own musings: they don't care about performance, only cost.

This will be the future of PCs in the workplace, or of cheap eMachines at Wal-Mart.
 
New Member
💻 Oldtimer
Joined
Nov 14, 2003
Messages
1,659
Best answers
0
Yep, I can see these being the budget machines of the future. However, because of the way they will work, hardcore gamers are still going to want, and need, custom rigs with dedicated graphics cards etc.
 
Lost in space
Banned
💻 Oldtimer
Joined
Dec 21, 2003
Messages
3,608
Best answers
0
Cucumba said:
Well, think about what you just posted. The goal is to make the ultimate OEM chip: one package for everything, system and video.
I don't need to think about what I posted, since I'm already well aware. The onboard gfx market is always crap in comparison; however, the GPU (whether using a shared memory controller or its own path to system memory) may actually be some pretty interesting tech, IMO, for OEM PCs that always love using onboard piles of crap.
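To put the shared-memory-controller point in perspective (my own illustrative numbers, not anything from the article or the thread): an on-die GPU going through the same memory controller as the CPU has to split a fairly small pool of bandwidth, which is the classic onboard-graphics bottleneck. A rough sketch, assuming dual-channel DDR2-800:

```python
# Rough sketch of the shared-memory squeeze on an integrated GPU.
# All figures are illustrative assumptions (dual-channel DDR2-800 era),
# not specs from the article.

BYTES_PER_PIXEL = 4  # 32-bit colour

def scanout_gbps(width: int, height: int, refresh_hz: int) -> float:
    """GB/s spent just refreshing the display from a shared framebuffer."""
    return width * height * BYTES_PER_PIXEL * refresh_hz / 1e9

peak = 12.8  # GB/s theoretical peak for dual-channel DDR2-800
scanout = scanout_gbps(1280, 1024, 60)

print(f"scanout alone: {scanout:.2f} GB/s ({scanout / peak:.1%} of the bus)")
# And that's before any texture fetches or framebuffer writes --
# every byte the GPU moves is a byte the CPU can't, which is why
# onboard graphics always trails a card with its own dedicated memory.
```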
Mad_AxMan said:
Yep, I can see these being the budget machines of the future. However, because of the way they will work, hardcore gamers are still going to want, and need, custom rigs with dedicated graphics cards etc.
Did I ever say anything to do with gamers in my comments? o_o
 
Live free or die by the sword
Retired Forum Staff
✔️ HL Verified
💻 Oldtimer
Joined
Dec 1, 2001
Messages
7,416
Best answers
0
Location
North East Pennsylvania
|Overlord| said:
I don't need to think about what I posted, since I'm already well aware.
If you don't want me to answer your questions, don't ask them in public.
 
New Member
Joined
Jan 7, 2003
Messages
203
Best answers
0
Intel already sort of went down this road with the Timna core, but it was cancelled, and the team went off and designed the "perfect" mobile processor (it used the P6 core with the best bits of the NetBurst architecture), from which the Pentium M was born (and on which Intel went on to base its future processors).
The markets for these chips would probably end up being small home-theatre PCs, plus basic work computers for Word/Excel and such. Cutting the GPU out of the rest of the system and "gluing" it to the processor should reduce power consumption considerably.
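A toy comparison of where those power savings might come from (every wattage here is an assumed, illustrative figure; neither the article nor the thread gives any numbers):

```python
# Toy power budget: discrete GPU vs a GPU fused onto the CPU die.
# All wattages are made-up illustrative assumptions, chosen only to
# show where integration saves power -- not measured figures.

discrete_system = {
    "CPU": 65.0,
    "northbridge (incl. GPU link)": 12.0,
    "discrete GPU card (incl. its own memory and VRMs)": 35.0,
}
integrated_system = {
    "CPU+GPU die (shared memory controller)": 78.0,
    "northbridge (no external GPU link)": 8.0,
}

for name, parts in (("discrete", discrete_system),
                    ("integrated", integrated_system)):
    print(f"{name}: {sum(parts.values()):.0f} W")

# The savings come from dropping the card's dedicated memory and
# voltage regulators, and from replacing a chip-to-chip bus with
# on-die wires, which are far cheaper to drive.
```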
 
Lost in space
Banned
💻 Oldtimer
Joined
Dec 21, 2003
Messages
3,608
Best answers
0
Cucumba said:
If you don't want me to answer your questions, don't ask them in public.
I'm not. The thread's main purpose is discussion; I don't necessarily need any questions answered at this stage.
Pemalite said:
Intel already sort of went down this road with the Timna core, but it was cancelled, and the team went off and designed the "perfect" mobile processor (it used the P6 core with the best bits of the NetBurst architecture), from which the Pentium M was born (and on which Intel went on to base its future processors).
The markets for these chips would probably end up being small home-theatre PCs, plus basic work computers for Word/Excel and such. Cutting the GPU out of the rest of the system and "gluing" it to the processor should reduce power consumption considerably.
I actually remember seeing Intel demo this with a concept/prototype model that was on display. Even though it was never released to the public, it just goes to show what is possible. Plus, power consumption reductions are always welcome =).
 
New Member
Joined
Jan 7, 2003
Messages
203
Best answers
0
Also... you read the Inquirer? o_O Use DailyTech over at Anandtech.com instead; it's much more reliable in my opinion :p (Anyone see the Inquirer's fake pictures in that "Nvidia quality crap" story, when it was clearly a Photoshop job? lol.)
I dunno.
 
Live free or die by the sword
Retired Forum Staff
✔️ HL Verified
💻 Oldtimer
Joined
Dec 1, 2001
Messages
7,416
Best answers
0
Location
North East Pennsylvania
Overlord said:
However, it leads me to question whether such a product will use up room on the die that would otherwise be used for additional processor cores.
Then clean up your English, because that clearly shows you asking a question in public. At any rate, whether or not you asked is not the subject of this thread, so get back on topic. You simply don't want input from me; you're being difficult for the sake of being difficult.
 
