How the mighty have fallen.

NVIDIA is killing the GTX260, GTX275, and GTX285, with the GTX295 almost assured to follow, as it (Nvidia: NVDA) abandons the high- and mid-range graphics card market. Due to a massive series of engineering failures, nearly all of the company’s product line is financially under water, and mismanagement seems to be killing the company.

Not even an hour after we laid out the financial woes surrounding the Nvidia GTX275 and GTX260, word reached us that they are dead. Normally, this would be an update to the original article, but this news has enough dire implications that it needs its own story. Nvidia is in desperate shape, whoop-ass has turned to ash, and the wagons can’t be circled any tighter.

Word from sources deep in the bowels of 2701 San Tomas Expressway tells us that the OEMs have been notified that the GTX285 is EOL’d, the GTX260 will be EOL’d in November or December depending on a few extraneous issues, and the GTX275 will be EOL’d within two weeks. I would expect this to happen around the time ATI launches its Juniper-based boards, so before October 22.

Here’s info on their problems with Intel. And when they wanted to show off a card at a conference, they had to fake it.

Found by Brother Uncle Don.




  1. JoaoPT says:

    #amodedoma
    Linux will not be a selling point in the near future. It simply lacks market share… Sure, some enthusiasts might jump in if they saw a benefit, but enthusiasts don’t support a giant like nVidia. They are cultivated in order to give an aura of coolness to the product, but in the end it’s the bulk markets that drive the company’s sales. And that’s Intel’s market. Ever browsed DELL’s page? Try to configure some machines. 90% of what they sell is “office” PCs, and those carry embedded Intel graphics.
    nVidia is desperately seeking the next paradigm in computing. That’s why they have Tegra for the mobile and handheld markets, that’s why they have chipsets. That’s why they pursue the GPU-CPU with Fermi.

  2. ECA says:

    29,
    And forcing games off the computer onto the CONSOLE.
    Then you have a RE-BUY plan every 3-5 years of $300-1000 for a NEW console. ADD in NET access and GAME cost and you have one hell of a market to make money.

    31,
    also part of the problem was in the last few years… how many Video cards have been released? TONS. Advances in the cards?? FEW.
    In the last 10 years what have they done? MORE ram, MORE power, and OPTIONS that should have been there already. NO real advancement. So what if you can render 1 billion polygons in 1 second? MOST of it still has to go through the CPU. The video card should TAKE the extra processing away from the CPU, NOT add to it. Windows is busy ENOUGH, then you add in all the graphics processing and you get SYSTEM LAG.

    I would love to see the DESKTOP rendered by the video card, NOT by windows.
    GAME cache on the CARD, not my hard drive or ram..
    Throw a Video/movie at the video card and it LINKS directly with the audio card and PLAYS the movie… NOT using the CPU/windows to decompress it from DIVX/AVI/MOV/FLV/VOB/…/… and the CPU trying to sync the audio to the video.. that ISN’T its job.

  3. ECA says:

    If you could just THROW the media at the Video and audio card and those cards would DO THE JOB, you would save the CPU/windows 80-90% of the processing and gain that in POWER to do other things. Intel TRIED to get around this by adding EXTRA cores, but windows ISN’T handling it properly.. Win7 comes up, and soon (in 2 years) win8.
    LET that 256meg-PLUS video card do some work.

  4. Wretched Gnu says:

    Er… are you people suggesting that the Fermi prototype was NOT an obvious, demonstrated fake? With a board that was obviously cut in *half* atop misaligned pins…? I suggest you read the article and look at the pictures.

    Why on earth would nvidia have to resort to such a desperate and pathetic measure if everything were okey-dokey..?

  5. ECA says:

    also FRAG..
    The 386 WAS 32-bit.. but win 3.1 wasn’t.
    and from win 2000 you could have a 64-bit system..
    And I think the P3 started the hardware at 64-bit.. IF you had the hardware..
    SOooo.. 64-bit isn’t anything..

    And as a kick in the shins.. BEFORE 1985 the AMIGA was a 16/32-bit system.

  6. joaoPT says:

    #34
    Fermi might not be in the thing in Jen-Hsun’s hand, but Fermi is real.
    And ATI (sorry, can’t seem to put myself into calling them AMD…) is not sitting on its hands either… They are both (and arguably Intel too, with Larrabee) moving to the GPU-CPU scenario… You could also make the point that IBM is doing the same with the Cell, only the SPEs in the Cell are not graphics processors. Nonetheless, massively parallel computing is here to stay.


