GPU Price Tracker
54 points by ushakov 2 months ago | 45 comments

- prhn 2 months ago
A price tracker should be more sophisticated than just pulling a single listing from Amazon and dumping whatever price a random 3rd party reseller is listing it for.
The prices here are not indicative of the market at all. They're a single sample from the most egregious offender.
More data points for a 5090 Founders:
- Amazon: $4,481
- StockX: $3,447
- eBay: $3,500-$4,000

I hope whatever product "United Compute" is selling is more thoughtfully constructed than this.
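(A minimal sketch of the kind of aggregation the parent comment is asking for, reusing the listing figures quoted above; the data source here is a placeholder, not the tracker's actual pipeline. The idea is to pull several listings per SKU and report a median and range instead of one reseller's price.)

```python
import statistics

# Hypothetical listings for one SKU (the 5090 FE figures quoted above);
# a real tracker would pull these from retailer APIs or scraped pages.
listings_usd = {
    "Amazon (3rd-party)": 4481,
    "StockX": 3447,
    "eBay (low)": 3500,
    "eBay (high)": 4000,
}

prices = sorted(listings_usd.values())
print(f"median: ${statistics.median(prices):,.0f}")  # $3,750
print(f"range:  ${prices[0]:,} - ${prices[-1]:,}")   # $3,447 - $4,481
# Reporting only the single Amazon 3rd-party listing (the max here)
# overstates the market.
```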
- AtheistOfFail 2 months ago
It's a .ai domain... they're selling "Wrapped LLM".
- flessner 2 months ago
Furthermore, the menu is missing a close button, the components look like shadcn or AI-generated, and overall it's not well optimized for mobile.
Also, listing Coca-Cola in the team section without any indication of a partnership or investment - likely as a joke - is not a smart move.
It looks like - and probably is - a random assortment of projects from a single person; the "branding" simply doesn't reflect that.
- cyberpower1 2 months ago
I think this is exactly what you are referring to: https://gpuprices.ai/
- bcraven 2 months ago
I find skinflint[0] is good for this sort of long-term tracking.
[0] https://skinflint.co.uk/?cat=gra16_512&view=gallery&pg=1&v=e...
- Alifatisk 2 months ago
I own Nvidia stock; I believe it will still climb higher than ever before. But at home, my setup has AMD components because they're better value.

I'm more into AMD cards than anything; I wish this site tracked AMD prices as well.
- thebruce87m 2 months ago
> I believe they will still climb higher than ever before.
I think this expectation is already priced in. I invested when I saw LLMs kicking off with no reflection in the NVIDIA share price and made 10x when the market caught up.
Now, with tariff uncertainty and Trump signalling to China (via Russia) that there would be no repercussions for invading Taiwan, I'm less convinced there is growth there, but there is the possibility of massive loss. In the days of meme stocks this might not matter, of course.
Note that an invasion of Taiwan would have huge implications everywhere but any company that needs leading edge semiconductors to directly sell their products would be screwed more than others.
- Alifatisk 2 months ago
Really? You think Nvidia will go downhill from here?
- amelius 2 months ago
Why is memory stuck at such low values while applications clearly demand more?
- mckirk 2 months ago
My guess would be 'artificial scarcity for the purpose of market segmentation', because people probably wouldn't buy that many of the expensive professional cards if the consumer cards had a ton of VRAM.
- YetAnotherNick 2 months ago
HBM is in very limited supply, and Nvidia tries to buy all the stock it can find at any price[1][2]. So the memory literally couldn't be increased.
[1]: https://www.nextplatform.com/2024/02/27/he-who-can-pay-top-d...
[2]: https://www.reuters.com/technology/nvidia-clears-samsungs-hb...
- WithinReason 2 months ago
How about GDDR?
- karmakaze 2 months ago
I'm surprised the RTX cards don't have a Terms of Use that prohibits running CUDA on them. They already removed NVLink from the 40-series onward. Maybe running 8K VR could use the 32GB on the 5090, but I can't imagine much else that's not compute.

I'm looking forward to newer APUs with onboard 'discrete' GPUs, quad-channel (or more) LPDDR5X+, and 128GB+ of unified memory that costs less than an M3 Ultra.
- cubefox 2 months ago
Nvidia indeed does this with the *60 cards, which are limited to 8 GB. They probably copied this upselling strategy from Apple laptops.
- nacs 2 months ago
Except now Apple, with its shared VRAM/RAM model, has better deals than Nvidia past 24GB of VRAM (for inference at least).

A MacBook or Mac Mini with 32GB as a whole system is now cheaper than a 24GB Nvidia card.
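(Rough back-of-the-envelope arithmetic, not taken from the thread, for why that 24GB boundary matters for local inference: model weights alone need roughly parameter count × bytes per parameter, before any KV cache or activations.)

```python
# Weights-only memory estimate for inference; illustrative, not a benchmark.
def weights_gb(params_billion: float, bytes_per_param: float) -> float:
    return params_billion * 1e9 * bytes_per_param / 1024**3

for params, dtype, bpp in [(13, "fp16", 2), (13, "q4", 0.5), (70, "q4", 0.5)]:
    print(f"{params}B @ {dtype}: ~{weights_gb(params, bpp):.0f} GB")
# 13B @ fp16: ~24 GB  -> already at the limit of a 24 GB card
# 13B @ q4:   ~6 GB
# 70B @ q4:   ~33 GB  -> fits in larger unified memory, not a 24 GB GPU
```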
- amelius 2 months ago
OK, time to start supporting another brand, folks.
- sokoloff 2 months ago
Applications that consumers use (games and desktop) work fine with the amount of memory that consumer GPUs have.
GPUs targeting more RAM-hungry applications exist, but they’re quite a bit more expensive, so people who play games buy gaming GPUs while people who need more VRAM buy cards targeting that application.
Why would a consumer want to pay for 40GB of VRAM if 12GB will do everything they need?
- Const-me 2 months ago
> work fine with the amount of memory that consumer GPUs have
Most consumers buy GPUs to play video games. Recently, Nvidia launched two flavors of the 5060 Ti consumer GPU, with 8GB and 16GB of memory; the cards are otherwise identical.

Apparently, the 8GB version is only good for 1080p resolution with no DLSS. In many games the difference between these versions is very substantial. The article says the 8GB version is deprecated right at launch: https://videocardz.com/newz/nvidia-geforce-rtx-5060-ti-with-...
- sokoloff 2 months ago
It looks like the 8GB cards are about $60 (10-12%) cheaper than the 16GB cards.

I sure don't want a world where we only have 32GB 5090s, and Nvidia reaching farther down the price-vs-performance curve to offer consumers a more affordable (but lower-performing) choice seems like a good thing rather than a bad one to me. (I genuinely don't see the controversy here.)
- os2warpman 2 months ago
There are supply constraints at almost every single step in the GPU supply chain.
An earthquake three months ago, production issues, and insatiable demand mean that every single GDDR/HBM chip being made at factories already operating at maximum capacity has been sold to a waiting customer.
If Nvidia wanted to double the amount of VRAM on their products, the only thing that would happen is the supply of finished products would be halved.
No amount of money can fix it, only time.
- throwawayffffas 2 months ago
What about AMD cards?
- the__alchemist 2 months ago
CUDA.
- kubb 2 months ago
The website has an ".ai" domain. It's aimed at people wanting to run inference, and maybe mine cryptocurrency, and for some reason only Nvidia cards are used for that.
- throwawayffffas 2 months ago
You can run inference on AMD cards; ROCm[1] is a thing. I am running inference on AMD cards locally. Plus, the highest-performing cards for computational workloads are AMD's[2], though of course you can't buy those on Amazon.

1. https://rocm.docs.amd.com/en/latest/index.html
2. https://www.amd.com/en/products/accelerators/instinct/mi300....
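(A hedged illustration, assuming a ROCm build of PyTorch rather than anything the commenter posted: PyTorch's ROCm builds expose AMD GPUs through the usual torch.cuda API, so the inference code path looks the same as on Nvidia.)

```python
import torch

# On a ROCm build of PyTorch, HIP-capable AMD GPUs show up through the
# regular torch.cuda API, so the same code runs on either vendor's hardware.
device = "cuda" if torch.cuda.is_available() else "cpu"
print(torch.cuda.get_device_name(0) if device == "cuda" else "CPU only")

x = torch.randn(1024, 1024, device=device)
y = x @ x  # matmul executes on the AMD GPU via ROCm/HIP
print(y.shape)
```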
- IshKebab 2 months ago
Some reason: CUDA
- jsight 2 months ago
It really is amazing how much these have increased. An Nvidia 3090 for almost as much as the MSRP of a 5090? Incredible!
- the__alchemist 2 months ago
This is now giving me a scarcity mindset: when prices go back to normal, I'll only buy top tier, so it lasts longer before needing to upgrade. Screwed that one up last time; I bought a 4080 when the window opened for a few weeks. (You could buy them directly from Nvidia's page for a bit.)
- mciancia 2 months ago
A 3090 is like 600-800 USD used, and there's basically no new stock.

They have shit data, since Amazon doesn't really sell most of those cards and they do no validation.
- Obertr 2 months ago
It would be nice to have something like a score indicating how powerful each card is relative to its price, to see which one is the best value.

One more thing: when clicking on the name, I'd like to be redirected to Amazon. The link on the far right was hard to find. :)
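(A minimal sketch of the perf-per-price score being suggested; the TFLOPS and price figures below are illustrative placeholders, not the site's data.)

```python
# Rank cards by raw throughput per dollar. Numbers are illustrative only.
cards = {
    "RTX 5090": {"tflops_fp32": 105, "price_usd": 3500},
    "RTX 4090": {"tflops_fp32": 83,  "price_usd": 2200},
    "RTX 3090": {"tflops_fp32": 36,  "price_usd": 700},
}

ranked = sorted(cards.items(),
                key=lambda kv: kv[1]["tflops_fp32"] / kv[1]["price_usd"],
                reverse=True)

for name, c in ranked:
    score = c["tflops_fp32"] / c["price_usd"] * 1000
    print(f"{name}: {score:.1f} TFLOPS per $1000")
```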
- tossandthrow 2 months ago
Not really a GPU tracker - more like an Nvidia card comparison.
AMD has some really interesting things on the drawing board, and Apple should definitely be in the mix.
- ein0p 2 months ago
Strange specs table - it seems to ignore the tensor core FLOPs, which is what you'd be using most of the time if you're interested in computational throughput.
- frognumber 2 months ago
I'm curious what's responsible for the current uptick.
- disqard 2 months ago
I typed in "RTX A2000" and got zero results.
I guess this website is for folks who're trying to equip a data center in their backyard with H100s?
- jerryseff 2 months ago
Any plans to make this available as an API, or to add common purchase links for each card to back up the "live" pricing data?
- dsign 2 months ago
Maybe Moore's law is dead, but its sister law about computing hunger doubling every year seems alive and well. And wait until we get bored of using GPUs to make artists starve and finally leverage AI for something fundamentally useful (or terrible), like helping billionaires live forever...
- casey2 2 months ago
LAWL "It's not a bubble"
- RolandTheDragon 2 months ago
Would love to see this become an arena where I can let my local GPU "fight" against other ones.