MSI N280GTX OC HydroGen And Super OC Video Cards Unveiled

Posted: Sep 3 2008, 1:41am CDT | by , Updated: Aug 11 2010, 10:23am CDT, in News | Hardware & Peripherals


MSI announces the MSI N280GTX OC HydroGen video card, featuring the highest clock setting on the market and a micro-channel water-cooling solution.
MSI has also tuned up the clock settings of the original fan-cooled N280GTX series, making the MSI N280GTX-T2D1G Super OC the highest-clocked GeForce GTX 200 series card in the world.

The MSI N280GTX OC HydroGen series runs the world's highest core clock setting at 700MHz, yet stays about 10 degrees Celsius cooler than standard fan-cooled GeForce GTX 200 series cards, even at full workload.
MSI achieves this with a HydroGen micro-channel water-cooling tank only one slot thick. This makes a multi-card setup in a single system possible, even with all the water-cooling accessories in place.
Thanks to the overclocked core, both the MSI N280GTX OC HydroGen and the N280GTX-T2D1G Super OC enjoy a net 17% performance boost over the standard GeForce GTX 200 series. Even against MSI's previous overclocked N280GTX-T2D1G-OC, the Super OC and N280GTX OC HydroGen still hold a 7% performance lead.

It is not known when these cool 280 GTX cards will ship.
Via the MSI site.




The Author

<a href="/latest_stories/all/all/2" rel="author">Luigi Lugmayr</a>
Manfred "Luigi" Lugmayr () is the founding Chief Editor of I4U News and brings over 25 years of experience in the technology field to the ever-evolving and exciting world of gadgets, tech and online shopping. He started I4U News back in 2000 and evolved it into a vibrant technology news and tech and toy shopping hub.
Luigi can be contacted directly at ml[@]



