File 146282170743.jpg - (1.24MB , 2000x1503 , NVIDIA-GeForce-GTX-1080-Dual-SLI.jpg )
No. 57979 ID: 0dcdc8
So from a rig perspective I have been holding out and skipped the 900 series.

I demand 4K gaming on my 50" and was not satisfied with the benchmark reports on the VR front for the 980 Tis or Titans...

I was betting on Pascal to carry the day and suffered in silence.

Oh, how the waiting appears to have paid off.

Gonna grab two of these straight away and laugh at all the 980s flooding the market/trashcan.
>> No. 57980 ID: 82a3e8
I tried to dig into Pascal and the claim that it lets your GPU do "deep learning" or some such shit.

What does it do exactly?

(Also, I thought about building a PC a while back but decided to wait because of Pascal and the promise of cheap big cards.)
>> No. 57982 ID: 9dcda2
>>57979
I too have been holding off on a new card.

http://arstechnica.com/gadgets/2016/05/nvidia-gtx-1080-1070-pascal-specs-pricing-revealed/

Sounds badass.
>> No. 57991 ID: d8ab9d
File 146291736525.jpg - (175.57KB , 1280x607 , 8800GTX.jpg )
I'm actually not impressed by the 1000 series.
It's a mere evolution. nVidia can do better.
They've done it before. That was 2006.

Some cards of days long past are spoken of as legends. So speak I of the demon that was the 8800GTX.

When I heard the first rumors about Pascal, I saw potential in it that chips since the G80 just didn't have, but in the end, the chip was just not made big or fast enough. And while the lack of HBM is understandable from a corporate perspective, it is ultimately disappointing.

I'll need to see another legend before I'll consider upgrading.
>> No. 57993 ID: 0cb322
So, do they actually have working gpus or are they peddling old chips and woodscrewed shrouds and brackets to boards again?
>> No. 57996 ID: 8be205
>>57993
Well, id's got Doom running at 200fps under Vulkan on Pascal development chips.
>> No. 57997 ID: 53e7c0
I've noticed they're really tugging their rod over the "2x Titan performance" claim, but almost no one mentions that it's only in VR.
>> No. 58001 ID: d0041a
>Nvidia
Really cute, guys. Nvidia is a loser this gen.

DX12 is an AMD API; AMD is where it's at now.
>> No. 58007 ID: 0dcdc8
File 146314683823.jpg - (38.20KB , 462x461 , BtBuD61CQAA9UPd.jpg )
58007
>>58001
You have any type of benchmarks to back that up or just fanboy hype?
>> No. 58008 ID: 38f673
>>57979
Damn it, now I have to hold off on making my new rig to see how the 1080/70s pan out.
>> No. 58035 ID: d0041a
>>58007
Do you? The 1080 isn't out yet, and I'm not going to stoop to posting cherry-picked benchmarks. Nvidia is claiming the 1080 has more power than two 980s in SLI, and they've made bold claims in the past that came up short. 970 or Fermi, anyone?

Meanwhile, AMD is rolling out 8GB HBM2 cards later this year, potentially as the R9 400 series, plus a dual-GPU HBM card to replace the R9 295X2.
The R9 295X2 blew out the Titan X at two-thirds of the price, and both the HBM1 Fury and the 390X are as capable as the 980, whatever cherry-picked benchmarks fanboys come up with.

Like I said, DX12 is AMD's modified Mantle API, an API which performs worse on Nvidia cards.

All of you should be seriously rethinking the 10xx series; HBM is well ahead of GDDR.
>> No. 58036 ID: f2400b
>>58035

>Like I said, DX12 is AMD's modified Mantle API, an API which performs worse on Nvidia cards.
Uhhh. I think you're thinking of Vulkan there, buddy. DirectX 12 is a similar low-overhead API, but it's not based on Mantle.

Most of the rest of what you've said is true, though I'll point out that dual-GPU cards aren't really great for standard purposes, since nobody truly supports them as standard just yet, and Nvidia's use of GDDR5X is down to manufacturing limitations on HBM2. I'll also ask if you've got something more solid re: Vega 10 than rumors suggesting Nvidia's release has pushed up the timetable, because otherwise AMD's main goal is to get the R9 400s (which are not enthusiast grade; they're replacements for the mainstream 200/300 series cards) out at the end of this month. While they'll bring major performance boosts, it's unlikely they'll outright beat a Fury or Nano.

I'm as heavily on Team Red as I can be, but let's not make shit up whole cloth, like Polaris being the second coming of Jesus. We'll have to wait a little longer for Zen and Vega for that to happen.
>> No. 58040 ID: d0041a
>>58036
Hey, I hear you. I'm just tired of AMD consistently putting out better products at better price points and STILL getting shit on in sales by Nvidia.

The 290X was a much, much better card than the 970, for example.
>> No. 58041 ID: f2400b
>>58040

The 290X was a good card in everything except 'developers supporting it' and 'power performance'.

Nvidia (and Intel too, actually) has the unfortunate tendency to try questionably legal/ethical market tactics to choke out the competition instead of letting its hardware stand on its own. Like encouraging developers to do things as in the attached video, which is a great example of games being set up so they can't run well on AMD cards.
>> No. 58042 ID: d0041a
>>58036
Vulkan is based on Mantle, and DX12 draws on both; the idea was an API with low overhead.

http://arstechnica.co.uk/gaming/2015/08/directx-12-tested-an-early-win-for-amd-and-disappointment-for-nvidia/
From the DirectX wiki and above source:
"Ashes of the Singularity was the first publicly available game to utilise DirectX 12. Testing by Ars Technica in August 2015 revealed slight performance regressions in DirectX 12 over DirectX 11 mode for the Nvidia GeForce 980 Ti, whereas the AMD Radeon R9 290x achieved consistent performance improvements of up to 70% under DirectX 12, in some scenarios the AMD outperformed the more powerful Nvidia under DirectX 12. The performance discrepancies may be due to poor Nvidia driver optimizations for DirectX 12, or even hardware limitations of the card which was optimized for DirectX 11 serial execution, however the exact cause remains unclear."

Tessellation and HairWorks were two of the things Nvidia pushed last time around.
AMD just stole the floor out from under them: DX12 is tailored to AMD's architecture, and AMD has the tech advantage currently and will keep it until Nvidia adopts HBM2 in earnest two generations from now.
The underhanded shit from the past, like paying developers off to optimize for Nvidia, is over. While Nvidia was sleeping, AMD cashed in on being the CPU/GPU/APU manufacturer for the past two Xboxes and got Microsoft to adopt an improved version of its API.
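For anyone wondering what "low overhead" actually buys you, here's a toy back-of-envelope model. All the per-call CPU costs below are made-up illustrative numbers, not measured driver figures; the point is the shape of the argument, not the exact values.

```python
# Toy cost model of why a low-overhead API matters. Per-call CPU costs
# are assumed illustrative values, not real driver measurements.
draw_calls = 10_000            # draw calls issued per frame
thick_api_us = 5.0             # assumed CPU cost per call, thick DX11-style driver (us)
thin_api_us = 1.0              # assumed CPU cost per call, thin DX12/Mantle-style API (us)

thick_ms = draw_calls * thick_api_us / 1000
thin_ms = draw_calls * thin_api_us / 1000
frame_budget_ms = 1000 / 60    # ~16.7 ms per frame at 60 fps

print(f"thick API: {thick_ms:.0f} ms/frame, thin API: {thin_ms:.0f} ms/frame, "
      f"60fps budget: {frame_budget_ms:.1f} ms")
# Under these assumptions the thick-API path blows the entire 60 fps budget
# on submission alone, while the thin path leaves most of the frame free.
```

That CPU headroom is also why the multithreaded-submission angle matters: spreading the remaining cost across cores only helps once the per-call overhead is small.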
>> No. 58045 ID: 0dcdc8
File 14637636715.png - (50.01KB , 980x720 , DX12_003-980x720.png )
>>58042
So that is a single 1080 benched on DX12 in Ashes.

To be direct, I like AMD; I have owned and still use their processors and their video cards. (5 PCs in the house.)

They definitely have a place in the market.

For that place to be on top, they will need to beat the 1080 in the 4K space (non-DX12).

Given the performance increases SLI has provided in the past, you can expect from these numbers that dual 1080s will hit the 55-60 fps range at 4K under DX12.

Clock speed is a bitch.
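Quick sanity check on that projection. The single-card fps figure below is an assumed stand-in (the bench image isn't reproduced here), and the scaling range reflects the roughly +70-80% gains SLI has historically delivered over one card:

```python
# Project dual-card fps from a single-card result and an SLI scaling factor.
# single_1080_4k_dx12 is an assumption for illustration, not the real bench.
def sli_projection(single_fps, scaling):
    """Dual-card fps given fractional SLI gain (0.7 = +70% over one card)."""
    return single_fps * (1 + scaling)

single_1080_4k_dx12 = 33.0  # assumed single GTX 1080 fps, 4K Ashes DX12
low = sli_projection(single_1080_4k_dx12, 0.70)
high = sli_projection(single_1080_4k_dx12, 0.80)
print(f"projected dual-1080 range: {low:.0f}-{high:.0f} fps")
```

With a single card in the low 30s, that lands right in the quoted 55-60 range.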

http://arstechnica.co.uk/gadgets/2016/05/nvidia-gtx-1080-review/

One of the most comprehensive breakdowns I have seen to date.
>> No. 58046 ID: 53e7c0
File 146378616877.png - (587.96KB , 747x932 , 5ad81b66c83cd316e207ddb608dab9d1.png )
>>58045
Considering the fluff pieces they blow up Apple's ass, I've quit reading Ars and wouldn't consider them a reliable source for anything.

In the meantime, enjoy these fine graphs.
>> No. 58048 ID: 0dcdc8
>>58046
>fluff pieces they blow up Apple

Lol, yes sir, totally on board: they were open Apple cock gobblers. At least nowadays they gobble the cock in their closet. I remember when there was nothing but Apple articles on there.

In their defense, I have seen that change a lot. There are nowhere near as many Apple articles.

That link is a five-page in-depth review with lots of bench charts.
>> No. 58050 ID: 53e7c0
>>58048
Yeah. Maybe they've changed. I haven't read it in a while.

Anyway, I've found that with nVidia and how they wine and dine the industry, it's hard to get facts until they actually launch stuff.

3.5GB, "PASCAL = 10X MAXWELL", and now all these reviewers are quoting "$599 MSRP!" even though the release price announced ages ago was $699, etc.

I'm a 9800GT user that nVidia bricked with AMD-detection drivers. I sent them an email and said that was the most expensive $250-something they ever made. I've only built one nVidia system since, for my brother.
>> No. 58053 ID: d0041a
File 146416299116.jpg - (103.35KB , 600x647 , DirectX-12-Async-Compute-Ashes-of-the-Singularity-.jpg )
>>58048
Ars, LinusTechTips, and quite a few others cannot be trusted to be unbiased when they have ad money thrown at them.
http://wccftech.com/directx-12-async-compute-nvidia-amd/
Hell, even Tom's does it from time to time.

980 vs 390X: pay close attention to the drastic DX12 performance boost for the 390X at both 1080p and 1600p.
http://hexus.net/tech/news/software/85628-ashes-singularity-directx-12-gaming-benchmark/

2K matchup in Ashes: 780 Ti + 980 Ti vs 290X + Fury X
https://www.bing.com/images/search?q=ashes+of+the+siungularity+benchmarks&view=detailv2&&id=5618381A07CE7C624F38DC00BE6C0C15B2FA4A26&selectedIndex=64&ccid=S8ykJPjo&simid=608037610568485288&thid=OIP.M4bcca424f8e8a7e77d330cb5c1d1122co0

R9 Nano vs 970 in Ashes DX11 and DX12 4k
http://www.legitreviews.com/amd-radeon-r9-nano-4gb-video-card-review_171995/7

1080P:
>http://www.computerbase.de/2015-10/ashes-of-the-singularity-neue-directx-12-benchmarks-mit-amd-und-nvidia/#diagramm-ashes-of-the-singularity-directx-12-1920-1080

>http://wccftech.com/nvidia-amd-graphics-cards-performance-boost-ashes-singularity-directx-12-benchmark-latest-drivers/


I've owned two Nvidia cards in the past and will probably own Nvidia again someday.
AMD now has the clear architecture and tech advantage.
They are responsible for DX12: they were involved in its development, and they'll be involved in the DX12 updates.
They'll be halfway through development of their second HBM2 line by the time Nvidia has its first HBM release.

AMD is working with Oculus on LiquidVR.

AMD has put most of its eggs into GPUs.
And the biggest threat DX12 brings to Nvidia's table?
It makes every game bridgeless-CrossFire compatible, Eyefinity compatible, FreeSync compatible.
Plus, slightly unrelated, DX12 is designed to perform better multithreaded on AMD CPUs.
>> No. 58054 ID: 53e7c0
>>58053
https://consumerist.com/2006/02/06/did-nvidia-hire-online-actors-to-promote-their-products/
>> No. 58057 ID: d0041a
>>58054
The 970 3.5GB fiasco is still fresh in the back of everyone's mind; the Titan X got beaten by SLI 980s or the 295X2 yet cost 1300 dollars; plus Nvidia had issues with non-reference cards this past generation, while on the AMD side you got non-refs coming out with fans galore or liquid cooling in any color scheme. Some AMD non-refs even had custom PCBs, ffs.

The 1080 may be great, or it may just be an OC'd 980 Ti with 8GB of VRAM; either way, you would have to be purposefully ignorant to buy Nvidia this generation.
>> No. 58060 ID: 8be205
>>58057
XFX, Sapphire, and PowerColor all make use of their own PCB designs pretty heavily alongside the reference PCBs. EVGA and Gigabyte are the only ones on the Big Green side I know that design their own PCBs in-house instead of having Nvidia do the work.
>> No. 58072 ID: 8be205
>>58060
>EVGA and Gigabyte are the only ones on the Big Green side I know that design their own PCBs in-house instead of having Nvidia do the work.

I was wrong, Zotac designs their own PCBs too.
>> No. 58074 ID: 53e7c0
>>58072
They've only been around since '06, and most opinion seems to be that they're junk.
>> No. 58075 ID: 8be205
>>58072
Fuck me again, ASUS designs their own PCBs as well. Their ROG lines are fucking impressive.
>> No. 58087 ID: 0dcdc8
File 146483030478.png - (36.56KB , 250x250 , 19490454.png )
>>57979
>In certain scenarios, according to Radeon Technologies Group Senior Vice President Raja Koduri, one Radeon RX 480 card could hold its own against a single Nvidia GeForce GTX 980 (from last generation), and two could deliver equivalent or better performance as a single GTX 1080 card—for about $200 less.

http://www.pcmag.com/news/344899/amd-targets-midrange-gamers-with-radeon-rx-480-video-card

>In certain scenarios
>certain scenarios

Only in Ashes...

We shall see what the benches come back with, but I am not sold.

Nvidia can have 200 dollars more for being top shelf on performance.
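For what it's worth, that "about $200 less" in the quote does pencil out from the announced launch prices: the RX 480 was announced starting at $199 for the 4GB model, against the GTX 1080's $599 MSRP ($699 for the Founders Edition).

```python
# Announced launch pricing: RX 480 from $199 (4GB model), GTX 1080 $599 MSRP.
rx480 = 199
gtx1080 = 599

two_rx480 = 2 * rx480
print(f"two RX 480s: ${two_rx480}, one GTX 1080: ${gtx1080}, "
      f"difference: ${gtx1080 - two_rx480}")
```

Of course, whether two midrange cards actually match one 1080 outside of Ashes is exactly the open question.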

I do, however, see a space for the AMD cards in, say, HTPCs eventually, once they get them down to fanless in two years, or maybe even in one of those backpack jobs.


I could totally see the AMD cards in this.

http://www.pcworld.com/article/3077543/virtual-reality/i-try-out-msis-backpack-pc-portable-vr-gaming-rig.html

Coupled with this.

https://youtu.be/aThCr0PsyuA

Of course, this is about 5-10 years out, though, so who knows what size and power requirements the technology will have at that point.
>> No. 58088 ID: 88a6f1
>>57991
You'll be waiting for a long time. It's in Nvidia's interest to try to stretch out iterative performance jumps per generation if there's a lack of competition from the other side.
>> No. 58089 ID: 53e7c0
>>58087
This is the mainstream tier. If you want a 390X or Fury replacement, you're going to have to wait for Vega or Full size Polaris.
>> No. 58099 ID: d4c8ee
File 146518539114.jpg - (175.79KB , 1200x676 , AMD-APU-Slide-1.jpg )
This is certainly a graph.