#1
Intel has announced that they will stop making replaceable CPUs after Haswell. From then on, all CPUs are supposed to come in BGA packaging, which means the only way to attach a CPU to a motherboard is by soldering it on. You won't be seeing that done with any home DIYer's toolkit, so it's the end of the road for that upgrade mechanism.

I've been upgrading my original system since 1987, and right now there are no original pieces remaining in it, but I can trace each of the pieces back in a chain to the original 8088 PC-XT clone I bought back then. I suppose it was bound to happen: not many people build their own PCs anymore, and for many years now it's been cheaper to buy a brand-new system than to keep upgrading an old one.

This is just an Intel announcement, and AMD hasn't said it would do the same thing, but I don't see AMD not following suit; it would help their financial situation too, and probably help them even more. I suppose you could keep upgrading if you bought a whole new motherboard alongside your CPU, though you'd probably have to buy new memory as well.

Yousuf Khan

Intel's Haswell Could Be Last Interchangeable Desktop Microprocessors - Report - X-bit labs
http://www.xbitlabs.com/news/cpu/dis...rs_Report.html
#2
Yousuf Khan wrote:
> Intel has announced that they will stop making replaceable CPUs after
> Haswell. From then on, all CPUs are supposed to come in BGA packaging,
> which means the only way to attach a CPU to a motherboard is by
> soldering it on. You won't be seeing that done with any home DIYer's
> toolkit, so it's the end of the road for that upgrade mechanism. [...]

There's always a solution. Remember that Foxconn makes their own sockets for motherboards, and they also make motherboards. The motherboard industry could cook up a flexible solution all on their own. There are a ton of cheesy adapters out there. Lots of opportunities for someone to cook up a solution. All that's needed is sufficient lead time to do the engineering and make a reliable product.

http://www.primedistributing.com/Pro...PA-BGA-SMT.jpg

And if Intel makes tested silicon die available as a purchase option, someone can package them at an MCM factory and put any kind of lead or contact on them that you want. This is just an opportunity for someone - a middle man - to make some cash.

Paul
#3
Apple was/is like that, with limited options for changing out hardware. If Intel completely removes the DIY aspect of the PC, then they are handing business over to Apple. Also, a lot of third-party vendors will probably close up shop.

Fixed hardware + a Bing OS (aka Windows 8) = a fast-declining PC industry.
#4
> The motherboard industry could cook up a flexible solution
> all on their own.

... but MB manufacturers usually work from reference hardware put out by Intel/AMD/etc. Would they be willing to produce items not covered in the reference?

What does BGA really mean in practice - does one buy the MB/CPU as a single item? Or does the whole DIY concept die and the Newegg 'Computer Hardware' section disappear?
#5
"Yousuf Khan" wrote:
Intel has announced that they will stop making replaceable CPU's after Haswell. From now on, all CPU's are supposed to be in BGA packaging, which means you can only attach CPU's to the motherboard by soldiering them on. You won't be seeing these in any home DIY's toolkit, so it's the end of the road for that upgrade mechanism. http://www.youtube.com/watch?v=KjKEmKUatJ4 and a whole bunch more at http://www.youtube.com/results?searc...=bga+soldering What, you mean you don't have a heat gun in your electronic toolbox along with the soldering iron, or hot air station sitting on the shelf? The spouse will get ****ed if you don't cleanup the pancake griddle after using it to remove and resolder the BGA parts. You must have soldering wick, though. Just means you'll have to put those old-school soldering techniques in your backstore memory and learn how to desolder and solder BGA parts. You're just spoiled by sockets that made it possible for home users with no or destructive soldering skills to add components to a mobo. Maybe the parts vendors are getting tired of the returns by boobs that don't employ anti-static measures, overclock, overheat, or otherwise destroy good parts. Soldering on the CPU, chipset, memory, and other components would certainly up the reliability of the assembly while reducing returns from ignorant, lazy, or sloppy users. Intel¢s Haswell Could Be Last Interchangeable Desktop Microprocessors - Report - X-bit labs http://www.xbitlabs.com/news/cpu/dis...rs_Report.html That wouldn't prevent first-time soldering of the CPU onto the BGA grid. The mobo maker could just make a plastic frame to hold the chip in place (both for position along with affixing to the mobo via spring clip) and the user would use a soldering iron with a tip designed for the BGA grid pattern. The user would buy the mobo they want, the CPU they want, and then do a one-time solder of the CPU onto the mobo. After all, after you buy the mobo and CPU and put them together, how often have you actually replaced the CPU? Yeah, if the CPU goes bad then you have to replace it but have you had to do so? When the CPU gets too old, underpowered, or lacking in firmware features, do you really replace just the CPU or do you replace the CPU, mobo, memory, and the whole smash to upgrade to newer hardware? Also, you can already buy mobo+CPU combos from online vendors. Most times they pre-install the CPU so all you have to do is attached the heatsink+fan (and sometimes you don't have to do that if you stay with the stock HSF for the CPU). So instead of them sliding the CPU into the ZIF socket for you, they'll have an inventory of pre-soldered combinations and you pick one to buy. |
#6
In article , "geoff" wrote:
Apple was/is like that, limited options in changing out hardware. If Intel completely removes the DIY aspect of a PC then they are handing business over to Apple. Also, a lot of third party vendors will probably close shop. Fixed hardware + a Bing OS (aka Windows 8) = a fast declining pc industry. Sounds like the 1990's Atari ST, AMIGA all over again. |
#7
On Sun, 25 Nov 2012, GMAN wrote:
> In article , "geoff" wrote:
>> Apple was/is like that, with limited options for changing out
>> hardware. If Intel completely removes the DIY aspect of the PC, then
>> they are handing business over to Apple. [...]
>
> Sounds like the 1990's Atari ST and Amiga all over again.

Really, every computer. Sure, you could buy an S-100 bus system in the early days, but there was limited ability to upgrade despite all the boards plugging into a motherboard that only had sockets. It was easy to move to the Z80 from the 8080. But the bus was very much tied to the 8080, so "foreign" CPUs took a lot of adapting. Even the front panel on the Altair was too specific to the 8080 to be useful with another CPU. The standardization was often because of CP/M, the operating system: since it was written to keep the I/O in a small section, one could fairly easily adapt it to other hardware (as long as it used the 8080). So the real upgrade path was the 16-bit CPU, preferably the 8088 or 8086. But then there were other issues besides differing bus signals, such as a lack of address lines for more RAM. There were various schemes to deal with that, but it took time before standardization set in, and by then it was mostly too late.

When MITS came out with a 6800-based computer in the fall of 1975, they put a different bus on it, and when SWTP put out their computer (which was a far more successful 6800 system than the MITS one) it used yet another bus (though that bus tended to be used by other 6800-based computers). The Digital Group was more like a hobby trying to turn into a commercial product; it used its own bus, which made it easy to offer different CPU boards, but they never went further than the Z80 and maybe the 6502. The Apple II was fairly flexible, so one could get Z80 cards for it, then later 6809 cards, and at some point 68000 cards. But they were workarounds, and usually the 6502 still did the I/O.

Let's not forget that the original IBM PC was no different from that Amiga or Atari. All three had CPUs in sockets, but there was no plug-in replacement that made things faster. You could work around that, but it would need a whole board. And you'd be stuck with the existing clock frequency unless you had complicated timing methods (to run the CPU faster but the bus at its regular rate).

It was only with time that the "IBM PC" became more flexible. And that was more a crossover between the CPU makers and the motherboard manufacturers. So you could put in a faster CPU, but that's because the motherboard company anticipated faster speeds and put in jumpers. That meant the CPU companies had to keep the other companies informed of where they were going. In the 386 era there was some level of variability, so you could get a cheaper one that had no math coprocessor built in (and oddly, then find a math coprocessor to add later).

It was really only in much more recent times that a motherboard had some hope of being usable over time, and that was because the CPUs generally stopped changing that much, the speed being the key factor. If the motherboard anticipated upgrades, and the CPU kept the same package and other features, then you could use the motherboard for a few years. Usually a new motherboard was needed if the data bus bumped up in size, the exception eventually being the 32-to-64-bit upgrade.

Otherwise, it would be no different from the Amiga or Atari, except that by that point nobody was making CPUs to plug into the expansion bus (I once found an 80286 card that did that), so you had to replace the motherboard. But then, the motherboard probably cost as much as one of those plug-in upgrade boards of the past, and the new motherboard didn't have to compromise. The only good thing was that the case was generally generic, so the new motherboard fit (well, so long as the area for connectors at the back matched up or could be replaced).

Michael
#8
On Sat, 24 Nov 2012 23:58:33 -0600, VanguardLH wrote:
> [...] The mobo maker could just supply a plastic frame to hold the
> chip in place (both for positioning and for affixing to the mobo via
> a spring clip), and the user would use a soldering iron with a tip
> designed for the BGA grid pattern. The user would buy the mobo they
> want, the CPU they want, and then do a one-time solder of the CPU
> onto the mobo. [...]

That's some funny stuff right there. Unless you're serious, of course...

Cheers!
#9
"daytripper" wrote:
On Sat, 24 Nov 2012 23:58:33 -0600, VanguardLH wrote: [...] The mobo maker could just make a plastic frame to hold the chip in place (both for position along with affixing to the mobo via spring clip) and the user would use a soldering iron with a tip designed for the BGA grid pattern. The user would buy the mobo they want, the CPU they want, and then do a one-time solder of the CPU onto the mobo. [...] That's some funny stuff right there. Unless you're serious, of course... Cheers! I was serious. You do know what "ball" means in BGA, right? It's a ball of solder. So why can't the chip, even a CPU, come prepped with the balls of solder on its pads, the mobo come with balls of solder on its grid and using feedthroughs so the solder is reached from the backside of the board, and all you have to do is keep the chip pressed against the grid, keep it aligned, heat up the solder gun with a matching grid tip, and just melt all the solder to weld the chip to the grid? You've never applied new solder to the underside of a PCB so it heats the solder on the other side through a feedthrough to use solder wick on the other side when you cannot otherwise reach the other side with a soldering iron? Heat travels. Of course, we're talking about DIY'ers that know how to solder and that it flows towards the heat source and what level of heat to apply and not the boobs that barely know how to push down the level for a ZIF socket. Not having sockets doesn't mean you can't DIY. It means the DIY'er will need better skills than pushing stuff into a socket or slot. |
#10
VanguardLH wrote:
> I was serious. You do know what "ball" means in BGA, right? It's a
> ball of solder. So why can't the chip, even a CPU, come prepped with
> the balls of solder on its pads, the mobo come with balls of solder
> on its grid, and all you have to do is keep the chip pressed against
> the grid, keep it aligned, heat up the soldering iron with a matching
> grid tip, and just melt all the solder to weld the chip to the grid?
> [...]
>
> Not having sockets doesn't mean you can't DIY. It means the DIYer
> will need better skills than pushing stuff into a socket or slot.

You at least want to solder all the balls at the same time. There is a magic alignment effect, where the wetted contacts tend to "pull" the chip into alignment, such that the chip rotates onto the grid of contacts underneath. You want the solder to fill the pads properly, which is going to happen if all the balls melt at the same time and the chip settles into place.

If you were a home user, and desperate for adventure, you could try a toaster oven. That's the closest thing to IR reflow you can arrange for real cheap. Some people used the toaster oven method to fix Nvidia GPU solder joints. But I would still put this idea in the "repugnant" category. You have absolutely no control of the temperature profile that way, and the toaster oven is going to be heating all sorts of stuff you don't want heated (think "burned plastic").

At the factory, they use an X-ray machine to verify BGA soldering. On a processor, two-thirds of the balls could be VCC and GND, and those wouldn't be candidates for boundary-scan verification. An X-ray can uncover balls damaged by the "popcorn" problem, for example. And more than one X-ray is taken: by holding the X-ray machine at an angle, you can photograph the balls from either side. No home user would be able to verify the solder job was completed properly. You wouldn't want to burn some power connections because too many VCC or GND pins were open circuit.

http://upload.wikimedia.org/wikipedi...joint_xray.jpg
(Voids caused by excessive moisture on the solder balls.)

http://glenbrook.webworksnow3.com/bl...3/bga_fig4.gif

With care, I'm told you can get defectivity down to around 1 ball in 100,000. That means, if you solder down a hundred chips each having 1000 balls underneath, one of the chips will have a single bad solder joint. It would take a little effort and expense, though, to get that good at it. The results of home users doing such soldering aren't going to be that good.

Paul