Quick News

Logitech released a TKL Mechanical Keyboard today:
http://news.logitech.com/press-release/logitech-g-unveils-tenkeyless-mechanical-gaming-keyboard

And AMD has new embedded GPUs on the market:

http://www.guru3d.com/news_story/amd_offers_amd_embedded_radeon_e8950mxm_module.html
 
I still love AMD, but after that thing taking out my PSU and leaking what smelled like half a bottle of vodka on me, I just do not trust AIOs on GPUs anymore :p

I've only ever had one AIO leak, and that was just recently. After months (years) of abuse, the original H100i sample leaked.

You can't treat AIOs like women, dude; you have to be gentle.
 
Aye, the H100i is the bona fide champion of the AIO world. Admittedly mine is starting to perish slightly around the barbs, but it's a proper little soldier.
 

Cheeky ^_^

I did install it carefully, and it literally just sat in there nicely chugging along for 8+ weeks, then just yesterday decided to take a pee XD
 
AMD was rumored to have gone to TSMC because of yield and defect issues GF was having on 14nm. However, WCCF had an interview with Jason Gorss, Senior Manager, Corporate and Technology Communications at GlobalFoundries, which provided the details below.

[paraphrased quote] GlobalFoundries' 14nm LPP/LPE is ahead of schedule and already exceeding plans for yields and defect density. The 14nm LPE variant is the early-to-market version of LPP for those wanting to adopt the node sooner; LPE is the lower-power version of the design. LPP is the second generation of LPE and is both more power-efficient and more powerful. The LPE variants are already in volume production and are meeting yield targets on lead customer products. LPP is set for qualification sometime this year, with the volume ramp beginning next year.

So the rumors of AMD going to TSMC because of issues are more than likely false, as GF is ahead of schedule. AMD has gone to GF before for CPUs and APUs and typically goes to TSMC for GPUs. I suspect Zen and any other new APUs will end up at GF so AMD can release products sooner rather than later, as they are desperately in need of revenue. It will be interesting to see if AMD goes to GF for GPUs as well, since TSMC has had many issues with 16nm. I fully suspect this will be the first time AMD has gone to a single fab for all of their products in next year's portfolio. Hopefully they are able to get the LPP version for their main products, Zen and Arctic Islands.

On the other hand, Nvidia is probably jumping to GF, as TSMC is only just getting around to 16nm after many delays. If Nvidia does stay on TSMC and gets onto 16nm, AMD would have a possible node lead, which may give them advantages in more ways than one. If Nvidia goes to GF, I really hope GF can cope with all that demand; AMD and Nvidia are only a fraction of their customer base, but having both of them ordering all those chips will really stress supply.

Source: http://wccftech.com/globalfoundries-14nm-finfet-amd-zen-2016/
I realize WCCF isn't the most reliable source; however, with an exclusive interview with GF, it's hard to discredit the article.
 
Polite malware gets into systems such as routers and internet-of-things devices by brute-forcing weak passwords via Telnet. Once the malware is in...

It pushes several security updates to the devices, most of which appear to be based on ARM architecture, in an attempt to make them more secure.
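
For anyone curious what "brute-forcing passwords via Telnet" actually looks like, here's a minimal sketch of the idea (nothing from the Symantec write-up, just an illustration you could point at one of your own devices to see if it still answers to factory defaults; the host address and credential list are placeholders):

```python
# Minimal sketch, not the malware's actual code: try a short list of well-known
# factory-default credentials against a device's Telnet login prompt.
# telnetlib is the Python standard-library module (removed in Python 3.13).
import telnetlib

DEFAULT_CREDS = [("admin", "admin"), ("root", "root"), ("root", "12345")]  # placeholder list

def try_default_logins(host, port=23, timeout=5):
    """Return the first (user, password) pair that appears to log in, else None."""
    for user, password in DEFAULT_CREDS:
        try:
            tn = telnetlib.Telnet(host, port, timeout)
            tn.read_until(b"login: ", timeout)
            tn.write(user.encode("ascii") + b"\n")
            tn.read_until(b"Password: ", timeout)
            tn.write(password.encode("ascii") + b"\n")
            response = tn.read_until(b"$ ", timeout)  # crude check: did we reach a shell prompt?
            tn.close()
            if b"incorrect" not in response.lower():
                return user, password
        except (OSError, EOFError):
            pass  # refused, timed out, or closed on us; try the next pair
    return None

if __name__ == "__main__":
    print(try_default_logins("192.0.2.1"))  # RFC 5737 documentation address, swap in your own device
```

The real thing obviously works through a far larger credential list; the point is just how low the bar is on the devices it targets.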

http://www.symantec.com/connect/blogs/there-internet-things-vigilante-out-there

I am truly loving this. It is a true gentleman's malware: "I see you're unsecured there, squire, please let me hack in and help you out!"
 
Microsoft has bought Havok, the creators of the famously popular Havok physics engine, from Intel. For those who don't know, Havok has been used in over 600 game titles, including Assassin's Creed, Call of Duty, Destiny, Dark Souls, The Elder Scrolls and Microsoft's own Halo. No financial details have been released as of yet, but I wouldn't expect anything over $100M. Since the acquisition took place, many people have been asking how MS will handle the licensing Havok/Intel offered in the past for studios who wanted to use the engine.

Here is what MS has said about licensing and the reasons behind their purchase:
We will continue to license Havok’s technology to the broad AAA games industry. This also means that we will continue to license Havok’s technology to run across various game consoles including Sony and Nintendo.

We believe that Havok is a fantastic addition to Microsoft’s existing tools and platform components for developers, including DirectX 12, Visual Studio, and Microsoft Azure. Microsoft’s acquisition of Havok continues our tradition of empowering developers by providing them with the tools to unleash their creativity to the world. We will continue to innovate for the benefit of development partners. Part of this innovation will include building the most complete cloud service, which we’ve just started to show through games like Crackdown 3.

I'm glad they are still allowing Havok's tech to be licensed, as it's quite a powerful physics engine. However, I'm curious to know whether the cost of licensing it has risen now that MS is the owner. I'm also curious because, now that they own it, all of their future exclusives and PC games will probably get to use it for free, which would help lower development costs. They could even let some studios license it for free in exchange for deals to use MS Azure cloud services to aid in processing, which could lower the cost of entry and get more people making more games, and that would only benefit everyone.

Source: http://www.engadget.com/2015/10/02/microsoft-buys-havok-physics/
 
DARPA are doing some amazing things with liquid cooling -

Winter is coming (to microchips)! Using microfluidic passages cut directly into the backsides of production field-programmable gate array devices, researchers at Georgia Tech are putting liquid cooling right where it’s needed the most – a few hundred microns away from where the transistors are operating. Combined with connection technology that operates through structures in the cooling passages, the new technologies could allow development of denser and more powerful integrated electronic systems that would no longer require heat sinks or cooling fans on top of the integrated circuits. The work is funded by DARPA's ICECool program. For more information visit: http://www.ece.gatech.edu/media/news/release.php?nid=455491

 
Rumor has it that NVIDIA is set to drop manufacture of the 2GB GTX 960 as a result of direct competition from AMD's R9 380, which also packs 2GB of memory but on a 256-bit memory bus, giving it considerably more oomph than NVIDIA's 128-bit bus on the 960 (rough bandwidth numbers below)...

So, is this a good move on NVIDIA's part, and what effect do we think this will have on the price of the current 4GB GTX 960? Will they drop it to be more competitive with AMD? We'll have to wait and see.
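
For a rough sense of what that bus-width difference means in practice, here's a back-of-the-envelope bandwidth comparison (the clocks below are the reference specs as I understand them, so treat the exact figures as approximate):

```python
# Theoretical memory bandwidth in GB/s = effective memory clock (MHz) * bus width (bits) / 8
# GDDR5 "effective" clock already includes the quad-data-rate factor.
def bandwidth_gb_s(effective_clock_mhz, bus_width_bits):
    return effective_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9

gtx_960 = bandwidth_gb_s(7010, 128)  # reference GTX 960: ~112 GB/s
r9_380 = bandwidth_gb_s(5700, 256)   # reference R9 380: ~182 GB/s
print(f"GTX 960: {gtx_960:.0f} GB/s, R9 380: {r9_380:.0f} GB/s")
```

Partner cards ship with different memory clocks and NVIDIA's colour compression narrows the real-world gap somewhat, but the raw difference is still substantial, especially as resolution climbs.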
 

I hope so. I know most people are still on 1080p rather than the higher resolutions where memory bandwidth becomes more of an issue, but it's a good move by Nvidia and proof that AMD still has some impact on the Green team. Exactly why we need them to stay in business.
 

I doubt they would outright stop selling these cards. If anything, they would release a 960 Ti.
 
That's not the issue or the idea; it's about competition. There is very little point just adding another card to the line-up to compete with the R9 380 when the 4GB 960 does the job just fine.
 
AMD suffers another blow as they lose another high-level member of staff.

http://www.theregister.co.uk/2015/10/14/phil_rogers_amd_nvidia/

Rogers left AMD this month to become Nvidia's compute server architect in Austin, Texas. He was with graphics chip biz ATI from October 1994 to 2006, when the company was gobbled up by AMD. At AMD, Rogers oversaw system architecture and performance.


Hopefully this doesn't have a negative impact on AMD, but since they are also laying off another 500 members of staff, it looks like AMD could be feeling the squeeze a bit.
 
Nvidia drivers will soon only be available through GeForce Experience :eek:

Soon, you won't be able to get Game Ready drivers for your GeForce graphics card unless you hand your email address over to Nvidia.
Nvidia’s pushing out a new beta version of its slick GeForce Experience software Thursday, building atop the awesome update from last month that lets you play local co-op games with your faraway friends—even if they don’t have gaming machines.

The update adds the ability to broadcast your games to both Twitch and YouTube Gaming at a buttery-smooth 60 frames per second at 1080p resolution. You can now stream games from your GeForce-equipped PC to an Nvidia Shield device at up to 60 fps at 4K resolution, and that’s with 5.1-channel surround sound, too. It’s all wonderful stuff, pushing Nvidia’s class-leading GeForce Experience software even further out in front of the competition, especially if you’re all-in on Nvidia’s ecosystem.

But what’s coming today isn’t the real news, even if it’s welcome news. The real news is what’s coming in December—or rather, what’s not coming after December.

Of drivers and single-source destinations
One of the key weapons in Nvidia’s arsenal against AMD is its deluge of Game Ready drivers. Virtually every major PC game release in the past two years has been accompanied by a day one, WHQL-certified Game Ready driver from Nvidia, designed to make the latest and greatest games run wonderfully on GeForce graphics cards. They’re great!

Sometime in mid-December, however, you’ll be able to install Game Ready drivers only via GeForce Experience—and even then only after you’ve registered a verified email address with Nvidia. The drivers you can grab on GeForce.com or via Windows Update will be limited to quarterly releases for bug fixes, new features, security updates and so on.

Locking performance-enhancing drivers that have always been freely available behind a registration wall chafes—hard—but Nvidia says the change will reduce headaches for both casual and hardcore gamers, as well as continue to push GeForce Experience as a go-to PC gaming solution.

“We kind of have two camps in terms of gamers,” Nvidia’s Sean Pelletier said in a group call with journalists. “On one hand you have the gamer that’s just casually playing things here and there, using their system for daily use and gaming on the side. They don’t want to be inundated with these [Game Ready] drivers…

“On the other side of the equation you have enthusiast gamers, who get excited about preloading a game, who want to play a game the day it comes out with all the bells and whistles,” Pelletier continued. “That’s obviously the demographic we’re looking at for Game Ready drivers. We’re targeting GFE as a single-source destination for those gamers.”
GeForce Experience has long functioned as a control hub of sorts for Nvidia users, offering one-click game optimization, easy driver downloads, the impressive Shadowplay video capture tool, the ability to stream PC games to Shield devices, and more. It’s great!—just like Nvidia’s Game Ready drivers. But while Nvidia reps tried to downplay the upcoming registration requirement by pointing out that “mid- to high-90 percent” of Nvidia owners already apply updates via GeForce Experience, the fact stands that currently, you can bask in all those value-adding features—and drivers—without ever having to register with Nvidia or officially log into GFE.

Nvidia plans to add more functionality to GFE in its quest to make the software a “single-source destination” for PC gamers, however. PC gaming news will make its way into GeForce Experience, as well as hardware giveaways and early access to games. It’s easy to envision Nvidia leveraging GeForce Experience to pass out codes for beta access to games, rather than relying on website-based giveaways as it did with the recent Rainbow Six Siege beta. One day, the free games that Nvidia bundles with its cards could even conceivably be delivered via GFE, similar to how Nvidia offered free Witcher 3 copies to Titan X owners earlier this year. Game-based goodies like that would basically require you to log in to register, anyway—as they have in the past.

But locking Game Ready drivers away unless you hand over your email address to Nvidia just feels icky—like an overreach that benefits Nvidia more than actual gamers. Alas, most gamers will likely wind up handing over the info, even if they grumble. The allure of Nvidia’s delicious Game Ready drivers is just too great, and once you’ve already invested in a Green Team graphics card, you’re likely to stick with it for a few years before upgrading. You can’t leave performance on the table for that long.

And it certainly feels like Nvidia knows it.

Source - http://www.pcworld.com/article/2993...s-behind-geforce-experience-registration.html
 