For the last two years, I've noticed a worrying trend: typical budget PCs (especially laptops) are being sold at higher prices with less RAM (just 8GB), lower-end CPUs, and no dedicated GPUs.
Industry mandate should have become 16GB RAM for PCs and 8GB for mobile years ago, but instead it is as if the computing/IT industry is regressing.
New budget mobiles are being launched with lower-end specs as well (e.g., new phones with Snapdragon Gen 6 and UFS 2.2). Meanwhile, features that used to be offered in budget phones, e.g., wireless charging, NFC, and UFS 3.1, have silently been moved to the premium segment.
Meanwhile, OSes and software are becoming more and more complex, bloated, unstable (bugs), and insecure (security loopholes ready for exploits).
It is as if the industry has decided to focus on AI and nothing else.
And this will be a huge setback for humanity, especially for students and the scientific community.
Or, we can expect better from software. Maybe someone can fork Firefox and make it run better, with a hard cap on how much memory a browser window can use.
The pattern of lazy, almost nonexistent optimization, combined with blaming consumers for having weak hardware, needs to stop.
On my 16GB RAM Lunar Lake budget laptop, CachyOS (Arch-based) runs so much smoother than Windows.
This is very unscientific, but according to htop, with Chrome playing music on YouTube, two browser games open, and VS Code having GitHub Copilot review a small project, I was only using 6GB of RAM.
For the most part I suspect I could do normal consumer stuff (filing paperwork and watching cat videos) on an 8GB laptop just fine, assuming I'm using Linux.
All this Windows 11 bloat makes computers slower than they should be. A part of me hopes this pushes Microsoft to at least create a low-RAM mode that just runs the OS and the display manager, then lets me use my computer as I see fit instead of constantly doing a million other weird things.
We don't *need* more RAM. We need better software.
The lack of a dedicated GPU is certainly unrelated to whatever has been happening for the last two years.
It's just that in the last 5 years integrated GPUs have become good enough even for mid-tier gaming, let alone for running a browser and hardware acceleration in a few work apps.
And even before 5 years ago, the majority of dedicated GPUs in relatively cheap laptops were garbage, barely better than the integrated one. Manufacturers mostly put them in there for the marketing value of having, e.g., an Nvidia dGPU.
I'm gonna be honest, that's not my experience at all. I got a laptop with a modern Ryzen 5 CPU four years ago that had an iGPU because "it's good enough for even mid-tier gaming!", and it was so bad that I couldn't play 1440p on YouTube without it skipping frames. I tried Parsec to my desktop PC and it was failing at that too. I returned it and bought a laptop with an Nvidia dGPU (still low end, I think it was like a 1050-refresh-refresh equivalent) and haven't had any of those problems. That AMD Vega GPU just couldn't do it.
A dedicated GPU is a red flag for me in a laptop. I do not want the extra power draw or the hybrid-graphics silliness. The Radeon Vega in my ThinkPad is surprisingly capable.
> Industry mandate should have become 16GB RAM for PCs
It was less than 10 years ago that a high-end PC would have this much RAM. I think the last decade of cheap RAM and increasing core counts (and clock speeds) has spoiled a lot of people.
We are just returning to trend. Maybe software will be written better now that you cannot expect the average low-budget PC to have 32GB of RAM and 8 cores.
I'm reading this thread on an 11-year-old desktop with 8GB of RAM and not feeling any particular reason to upgrade, although I've priced it out a few times just to see.
Mint 22.x doesn't appear to be demanding any more of my machine than Mint 20.x. Neither is Firefox or most websites, although YouTube chat still leaks memory horrendously. (Of course, download sizes have increased.)
Mint is probably around 0.05% of desktop/laptop users.
I think the root comment is looking at the overall picture of what customers can get for their money, and sees it getting worse.
This wasn’t mentioned, but it’s a new thing for everyone to experience, since the general trend of computer hardware is it gets cheaper and more powerful over time. Maybe not exponentially any more, but at least linearly cheaper and more powerful.
The OP is not looking at the static point (the price of that item) but at the trend, i.e., the derivative of price vs. quality. It was on a steep upward incline, and now it's flattening.
I've been enjoying running Mint on my terrible-spec Chromebook - it only has 3GB of RAM, but it rarely exceeds 2GB used with random additions and heavy Firefox use. The battery life is obscenely good too; I easily break 20 hours on it as long as I'm not doing something obviously taxing.
Modern software is fine for the most part. People look at browsers using tens of gigabytes on systems with 32GB+ and complain about waste rather than being thrilled that it's doing a fantastic job caching stuff to run quickly.
I think it will be a good thing actually. Engineers, no longer having the luxury of assuming that users have high end system specs, will be forced to actually write fast and efficient software. No more bloated programs eating up RAM for no reason.
The problem is that higher performing devices will still exist. Those engineers will probably keep using performant devices and their managers will certainly keep buying them.
We'll probably end up in an even more bifurcated world where the well-off have access to a lot of great products and services that most of humanity is increasingly unable to access.
Have the laws of supply and demand been suspended? Capital is gonna pour into memory fabrication over the next year or two, and there will probably be a glut 2-3 years from now, followed by retrenchment and wails that innovation has come to a halt because demand has stalled.
If "performant" devices are not widespread then telemetry will reveal that the app is performing poorly for most users. If a new festure uses more memory and sugnificantly increases the crash rate, it will be disabled.
Apps are optimized for the install base, not for the engineer's own hardware.
Some open source projects use Slack to communicate, which is a real RAM hog. GitHub, especially for viewing large PR discussions, takes a huge amount of memory.
If someone with a low-memory laptop wants to get into coding, modern software-development-related services are incredible memory hogs.
I'm not optimistic that this would be the outcome. You likely will just have poor running software instead. After all, a significant part of the world is already running lower powered devices on terrible connection speeds (such as many parts of Africa).
> a significant part of the world is already running lower powered devices
but you cannot consider this in isolation.
The developed markets have vastly higher-spending consumers, which means companies cater to those higher-spending customers proportionately more (as profits demand it). The implication, therefore, is that lower-spending markets get less investment and are catered to less; after all, R&D spending is still a limiting factor.
If the entirety of the market were running on lower-powered devices, then it would get catered to - because there'd be no (or not enough) customers with high-powered devices to profit from.
I wonder what we can do to preserve personal computing, where users, not vendors, control their computers? I’m tired of the control Microsoft, Apple, Google, OpenAI, and some other big players have over the entire industry. The software has increasingly become enshittified, and now we’re about to be priced out of hardware upgrades.
The problem is coming up with a viable business model for providing hardware and software that respect users’ ability to shape their environments as they choose. I love free, open-source software, but how do developers make a living, especially if they don’t want to be funded by Big Tech?
DRAM spot prices are something like what they were 4 years ago. Having cheap RAM is nice, but it doesn't cost an extraordinary amount. I recently needed some RAM and was able to pick up 16x32GB DDR4 for $1600. That's about twice as expensive as it used to be, but $1600 is pretty cheap for 512 GiB of RAM.
A 16 GiB M4 Mac Mini is $400 right now. That covers any essential use-case which means this is mostly hitting hobbyists or niche users.
Server stuff. Nothing interesting. Supermicro H11 + Epyc 7xxx + RAM. I have a 6x4090 setup for local LLMs and I got myself a 128 GB M4 Max laptop thinking I'd do that, but if I'm being honest I need to get rid of that hardware. It's sitting idle because the SOTA ones are so much better for what I want.
We’ve been able to hold the same price we had at launch because we had buffered enough component inventory before prices reached their latest highs. We will need to increase pricing to cover supplier cost increases though, as we recently did on DDR5 modules.
Note that the memory is on the board for Ryzen AI Max, not on the package (as it is for Intel’s Lunar Lake and Apple’s M-series processors) or on die (which would be SRAM). As noted in another comment, whether the memory is on the board, on a module, or on the processor package, they are all still coming from the same extremely constrained three memory die suppliers, so costs are going up for all of them.
Longer contracts are riskier. The benefit of having cheaper RAM when prices spike is not strong enough to outweigh the downside of paying too much for RAM when prices drop or stay the same. If you're paying a perpetual premium over the spot price to hedge, then your competitors will have pricing power over you and will slowly drive you out of the market. The payoff when the market turns in your favor just won't be big enough, and you might not survive as a business long enough to see it. There's also counterparty risk: if you hit a big enough jackpot, your upside is capped by what would make the supplier insolvent.
All your competitors are in the same boat, so consumers won't have options. It's much better to minimize the risk of blowing up by sticking as closely to spot as possible. That's the whole idea of lean. Consumers and governments were mad about supply chains during the pandemic, but companies survived because they were lean.
In a sense this is the opposite risk profile of futures contracts in trading/portfolio management, even though they share some superficial similarities. Manufacturing businesses are fundamentally different from trading.
They certainly have contracts in place that cover goods already sold. They do a ton of preorders which is great since they get paid before they have to pay their suppliers. Just like airlines trade energy futures because they’ve sold the tickets long before they have to buy the jet fuel.
The risk is that such longer contracts would then lock you into a higher-cost component for longer if the price drops. Longer contracts only look good in hindsight if RAM prices increased (unexpectedly).
> Question: are SoCs with on-die memory affected by this?
SoCs with on-die memory (which is, these days, exclusively SRAM, since I don't think IBM's eDRAM process for mixing DRAM with logic is still in production) will not be affected. SiPs with on-package DRAM, including Apple's A and M series SiPs and Qualcomm's Snapdragon, will be affected -- they use the same DRAM dice as everyone else.
The aforementioned Ryzen AI chip is exactly what you describe, with 128 GB on-package LPDDR5X. I have two of them.
To answer the original question: the Framework Desktop is indeed still at the (pretty inflated) price, but for example the Bosgame mini PC with the same chip has gone up in price.
Apple secured at least a year's worth of memory supply (not in actual chips but in prices).
The bigger the company, the longer the contract.
However, it will eventually catch up even to Apple.
It is not prices rising due to demand alone, but also manufacturing being redirected from things like LPDDR for iPhones to HBM and the like for servers and GPUs.
I have a feeling every single supplier of DRAM is going to be far more interested in long-term contracts with Apple than with (for example) OpenAI, since there's basically zero possibility Apple goes kaput and reneges on their contracts to buy RAM.
I would think so because fab capacity is constrained, and if you make an on-die SoC with less memory, it uses fewer transistors, so you can fit more on a wafer.
Software has gotten bad over the last decade. Electron apps were the start but these days everything seems to be so bloated, right from the operating systems to browsers.
There was a time when Apple was hesitant to add more RAM to its iPhones and app developers had to work hard to make apps efficient. The last few years have shown Apple going from 6GB to 12GB so easily for their 'AI', while I consistently see the quality of apps on the App Store deteriorating. iOS 26 and macOS 26 are so aggressive about memory swapping that loading Settings can take time on devices with 6GB of RAM (absurd). I wonder what else they have added that apps need purging so frequently. A 6GB iPhone and an 8GB M1 felt incredibly fast for a couple of years. Now, apparently, they are slow, as if they were really old.
Windows 11 and Chrome are a completely different story. Windows 10 ran just fine on my 8th-gen PC for years. Windows 11 is very slow, and Chrome is a bad experience. Firefox doesn't make it better.
I also find that GNOME and COSMIC DE are not exactly great with memory. A bare-minimum desktop still takes up 1.5-1.6GB of RAM on a 1080p display, and with some tabs open, a terminal, and VS Code (again, Electron) I easily hit 8GB. Sway is better in this regard. I find Alacritty, Sway, and Firefox together make for a good experience.
I wonder where we are heading on personal computer software. The processors have gotten really fast and storage and memory even more so, but the software still feels slow and glitchy. If this is the industry's idea of justifying new hardware each year we are probably investing in the wrong people.
Fair. I installed a MIDI composition app recently that was 1.2 GB! Now, it does have some internal synthesis that uses samples, but only a limited selection of sounds, so I think 95% of the bulk is from Electron.
Firefox is set to allocate memory until a certain absolute limit or memory pressure is reached. It will eat memory whether you have 4GB of RAM or 40GB.
Set this to something you find reasonable: `browser.low_commit_space_threshold_percent`
And make sure tab unloading is enabled.
Also, you can achieve the same thing with cgroups by giving Firefox a slice of memory it can grow into.
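If you're on a systemd-based distro, here's a minimal sketch of that cgroup approach (the 1.5G/2G values below are just example numbers to pick to taste, not recommendations):

```
# Run Firefox in a transient user scope with its own memory budget.
# MemoryHigh is the soft limit where the kernel starts reclaiming/throttling;
# MemoryMax is the hard cap for the whole scope.
systemd-run --user --scope \
  -p MemoryHigh=1.5G \
  -p MemoryMax=2G \
  firefox
```

Combined with tab unloading (about:unloads shows which tabs Firefox would discard first), it stays surprisingly usable under a cap like that.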
Positive downstream effect: the way software is built will need to be rethought and improved to wring efficiency out of stagnating hardware. Think of how staggering the step from the start of a console generation to the end used to be. Natively compiled languages have made great leaps that might be worth pursuing again.
Alternatively, we'll see a drop in deployment diversity, with more and more functionality shifted to centralised providers that have economies of scale and the resources to optimise.
E.g. IDEs could continue to demand lots of CPU/RAM, and cloud providers are able to deliver that cheaper than a mostly idle desktop.
If that happens, more and more of its functionality will come to rely on having low datacenter latencies, making use on desktops less viable.
Who will realistically be optimising build times for use cases that don't have sub-ms access to build caches? And when those build caches are available, what will stop the median program from having even larger dependency graphs?
I’d feel better about the RAM price spikes if they were caused by a natural disaster and not by Sam Altman buying up 40% of the raw wafer supply, other Big Tech companies buying up RAM, and the RAM oligopoly situation restricting supply.
This will only serve to increase the power of big players who can afford higher component prices (and who, thanks to their oligopoly status, can effectively set the market price for everyone else), while individuals and smaller institutions are forced to either spend more or work with less computing resources.
The optimistic take is that this will force software vendors into shipping more efficient software, but I also agree with this pessimistic take, that companies that can afford inflated prices will take advantage of the situation to pull ahead of competitors who can’t afford tech at inflated prices.
I don’t know what we can do as normal people other than making do with the hardware we have and boycotting Big Tech, though I don’t know how effective the latter is.
Hey, sorry, I didn't know that. I had watched the short-form content (https://www.youtube.com/shorts/eSnlgBlgMp8) [Asus is going to save gaming] and I didn't know that it was a rumour.
Asus doesn't make RAM. That's the whole problem: there are plenty of RAM retail brands, but they are all just selling products that originate from only a couple of actual fabs.
Do they make DRAM? I thought they made compute chips mostly.
If I recall correctly, RAM is even more niche and specialized than the (already quite specialized) general chip manufacturing. The structure is super-duper regular, just a big grid of cells, so it is super-duper optimized.
They (GF) do not make DRAM. They might have an eDRAM process inherited from IBM, but it would not be competitive.
You’re correct that DRAM is a very specialized process. The bit cell capacitors are a trench type that is uncommon in the general industry, so the major logic fabs would have a fairly uphill battle to become competitive (they also have no desire to enter the DRAM market in general).
So far all I am seeing is an increase in prices, so any company claiming it will "ramp up production" here is, in my opinion, just lying for tactical reasons.
Governments need to intervene here. This is a mafia scheme now.
I purchased about three semi-cheap computers in the last ~5 years or so. Looking at the RAM prices, the very same units I bought (!) now cost 2.5x as much as before (here I refer to my latest computer model, from 2 years ago). This is a mafia now. I also think these AI companies should be extra taxed because they cause us economic harm here.
Micron is exiting direct to consumer sales. That doesn't mean their chips couldn't end up in sticks or devices sold to consumers, just that the no-middleman Crucial brand is dead.
Also, even if no Micron RAM ever ended up in consumer hands, it would still reduce prices for consumers by increasing the supply to other segments of the market.
I've been ruminating on this for the past two years. In life before AI, most compute stayed cheap and sat pretty much 90% idle; we are finally getting to the point of using all of it. We will probably find more algorithms to improve the efficiency of all the matrix computations, and with the AI bubble the same thing will happen that happened with the telecom bubble and all the fiber-optic infrastructure that turned out to be drastically over-provisioned. Fascinating times!
Imagine someone goes to the supermarket and buys all the tomatoes. Then the supermarket owner says, "I don't know, he bought them all at once, so it was a better sale." And he sells the remaining 10% of the tomatoes at a huge markup.
Except it's still sitting idle in warehouses while datacenters get built. They aren't running yet. Unlike with fiber, GPUs degrade rapidly with use, and for now datacenters need to be practically rebuilt to fit new generations, so we shouldn't expect much reusable hardware to come from this
At this current pace, if "the electorate" doesn't see real benefits to any of this, 2028 is going to be a referendum on AI, unfortunately.
Whether you like it or not, AI right now is mostly
- high electricity prices
- crazy computer part prices
- phasing out of a lot of formerly high paying jobs
and the benefits are mostly
- slop and chatgpt
Unless OpenAI and co produce the machine god, which genuinely is possible. If most people's interactions with AI are the negative externalities they'll quickly be wondering if ChatGPT is worth this cost.
> they'll quickly be wondering if ChatGPT is worth this cost
They should be, and the answer is obviously no—at least to them. No political or business leader has outlined a concrete, plausible path to the sort of vague UBI utopia that's been promised for "regular folks" in the bullish scenario (AGI, ASI, etc.), nor have they convincingly argued that this isn't an insane bubble that's going to cripple our economy when AGI doesn't happen—a scenario that's looking more and more likely every day.
There is no upside and only downside; whether we're heading for sci-fi apocalypse or economic catastrophe, the malignant lunatics pushing this technology expect to be insulated from consequences whether they end up owning the future light-cone of humanity or simply enjoying the cushion of their vast wealth while the majority suffers the consequences of an economic crash a few rich men caused by betting it all, even what wasn't theirs to bet.
Everybody should be fighting this tooth and nail. Even if these technologies are useful (I believe they are), and even if they can be made into profitable products and sustainable businesses, what's happening now isn't related to any of that.
I hope they do. We live in a time of incredibly centralized wealth & power and AI and particularly "the machine god" has the potential to make things 100x worse and return us to a feudal system if the ownership and profits all go to a few capital owners.
> At this current pace, if "the electorate" doesn't see real benefits to any of this, 2028 is going to be a referendum on AI, unfortunately.
Not saying this is necessarily a bad prediction for 2028, but I'm old enough to remember when the 2020 election was going to be a referendum on billionaires and big tech monopolies.
For good measure, a bunch of this is funded through money taken directly from the electorate's taxes and given to a few select companies, whose leaders then graciously donate to the latest Ballroom grift. Micron, so greedy they thought nothing of shutting down their consumer brand even when it costs them nothing at all, got $6B in CHIPS Act money in 2024.
I remember the HDD shortage after flooding in Thailand. There was a price surge for a year or so, capacity came back online, and the price slowly eased. If AI crashes, prices might quickly collapse this time. If it doesn't, it'll take time, but new capacity will come online.
Nah. The demand is driven by corporations that hoard hardware in datacenters, starving the market and increasing prices in the process - otherwise known as scalping. Then they sell you back that same hardware, in the form of cloud services, for even higher prices.
The government can't, and doesn't, make infinite money. Govt debt can't grow to infinity either, so if revenue is not increased, unpredictable and unpleasant events will follow. However, the taxation issue is so hopelessly misrepresented and misunderstood that I'm not going to discuss it here.
Wow, this is an economics-free take if I've ever heard one. And calling it scalping feels laughable.
There are multiple RAM providers. Datacenters, many competing ones, are gobbling up RAM because there is currently a huge demand. And, unlike actual ticket scalpers, datacenters perform a very valuable service beyond just reselling the RAM. After all, end users could buy up RAM and GPUs themselves and build their own systems (which is basically what everybody did as recently as the early 00s), but they'd rather rent it because it's much less risky with much less capex.
This is simply run-of-the-mill supply and demand. You might convince me there was something nefarious if there was widespread collusion or monopoly abuse, and while some of the circular dealing is worrisome, it's still a robust market with multiple suppliers and multiple competitors in the datacenter space.
> This is simply run-of-the-mill supply and demand.
That's definitely NOT run-of-the-mill demand because it comes from companies buying hardware at operating loss, which can only be recouped by scalping higher prices at the expense of a starved market.
Demand funded by circular financial agreements and off-the-book debt isn't "run-of-the-mill" by any stretch.
Another way to think about it is a good that we once bought for private use where it sat around underutilized the majority of the time is instead being allocated in data centers where we rent slices of it, allowing RAM to be more efficiently allocated and used.
Yes it sucks that demand for RAM has led to scarcity and higher prices but those resources moving to data centers is a natural consequence of a shareable resource becoming expensive. It doesn’t have to be a conspiracy.
And in the process of doing so they're providing a hugely valuable service by building these complex systems out of lots of different components, and people and companies find a lot of value in that service.
Calling this "scalping" is just economically illiterate conspiracy theorizing.
I now consider this a mafia that aims to milk us for more money. This includes all AI companies but also manufacturers who happily benefit from this. It is a de-facto monopoly. Governments need to stop allowing this milking scheme to happen.
"Monopoly" means one seller, so you can't say multiple X makes a monopoly and make sense. You probably mean collusion.
If demand exceeds supply, either prices rise or supply falls, causing shortages. Directly controlling sellers (prices) or buyers (rationing) results in black markets unless enforcement has enough strength and integrity. The required strength and integrity seems to scale exponentially with the value of the good, so it's typically effectively impossible to prevent out-of-spec behavior for anything not cheap.
If everyone wants chips, semiconductor manufacturing supply should be increased. Governments should subsidize domestic semiconductor industries and the conditions for them to thrive (education, etc.) to meet both goals of domestic and economic security, and do it in a way that works.
The alternative is decreasing demand. Governments could hold bounty and incentive programs for building electronics that last a long time or are repairable or recyclable, but it's entirely possible the market will eventually do that.
> If everyone wants chips, semiconductor manufacturing supply should be increased. Governments should subsidize domestic semiconductor industries and the conditions for them to thrive (education, etc.) to meet both goals of domestic and economic security, and do it in a way that works.
If there is already demand at this inflated price, shouldn’t we ask why more capacity is not coming online naturally first?
Technically it's a lot closer to a monopsony (Sam Altman/OpenAI cornering 40% of the market on DRAM in a way that is clever for his interests but harms the rest of the world that would want to use it). I keep hoping that necessity will somehow spur China to become the mother of invention here and supply product to serve the now lopsided, constrained supply amid increasing demand, but I just don't know how practical that will be.
Well, thank the FSM that the article opens right up with "buy now!" No thanks, I'm kind of burnt out on mindless consumerism; I'll go pot some plants or something.
I highly recommend disabling JavaScript in your browser.
Yes, it makes many sites "look funny", or maybe you have to scroll past a bunch of screen-sized "faceplant", "twitverse", and "instamonetize" icons, but there are far fewer ads (like none).
And of course some sites won't work at all. That's OK too; I just don't read them. If it's a news article, it's almost always available on another site that doesn't require JavaScript.
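If you want that as a persistent Firefox setting rather than a per-site toggle (NoScript comes up downthread for that), the blanket switch is the `javascript.enabled` pref; a one-line user.js sketch:

```
// user.js in your Firefox profile directory: turn JavaScript off globally.
// Flip it back to true (or manage per-site with an extension) for sites you trust.
user_pref("javascript.enabled", false);
```

The same pref can also be toggled live in about:config if you'd rather not commit to it.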
I whole-heartedly agree with your recommendation and join in encouraging more adopters of this philosophy and practice.
Life online without JavaScript is just better. I've noticed an increase in sites that are useful (readable) with JavaScript disabled. Better than 10 years ago, when broken sites were rampant. Though there are still the lazy ones that are just blank pages without their JavaScript crutch.
Maybe the hardware/resource austerity that seems to be upon us now will result in people and projects refactoring, losing some glitter and glam, getting lean. We can resolve to slim down, drop a few megs of bloat, use less ram and bandwidth. It's not a problem; it's an opportunity!
In any case, Happy New Year! [alpha preview release]
I would not be able to handle that due to video streaming, web clients for things like email, etc. And some sites I trust (including HN) provide useful functionality with JS (while degrading gracefully).
But I use NoScript and it is definitely a big help.
Windows OS and Surface (CoPilot AI-optimized) hardware have been combined in the "Windows + Devices" division.
A $999 MacBook Air today is vastly better than the $999 MacBook Air of 5 years ago (and even more so once you account for inflation).
That's like 100B+ instructions on a single core of your average superscalar CPU.
I can't wait for maps loading times being measured in percentage of trip time.
On the bright side, I'm not responsible for the UI abominations people seem to complain about WRT laptop specs.
Looks like the frame.work desktop with the 128GB Ryzen is shipping now at the same price it was at release, and Apple is offering 512GB Mac Studios.
Are snapdragon chips the same way?
"dice" is the plural for the object used as a source of randomness, but "dies" is the plural for other noun uses of "die".
https://www.google.com/amp/s/www.indiatoday.in/amp/technolog...
Presumably the boom times are the main reason why investment goes into it so that years later, consumers can buy for cheap.
next stage is paving everything with solar panels.
A dad comes home and tells his kid, “Hey, vodka’s more expensive now.” “So you’re gonna drink less?” “Nope. You’re gonna eat less.”
Prices are already through the roof...
https://www.tomsguide.com/news/live/ram-price-crisis-updates
2028 is another story, depending on whether this frenzy continues and on fabs being built (I don't know whether they are as hard to build as CPU fabs).
So let's see if they might "save us".
https://www.tomshardware.com/pc-components/dram/no-asus-isnt...
My bad
And a couple of smaller ones: CXMT (if you’re not afraid of the sanctions), Nanya, and a few others with older technology
Isn't Micron stopping all consumer RAM production? So their factories won't help anyway.
https://en.wikipedia.org/wiki/Bullwhip_effect
More explanations here:
https://news.ycombinator.com/item?id=46416934
A bunch of problems pop up when a handful of super wealthy corporations can control markets.
https://www.merriam-webster.com/dictionary/oligopoly
e.g., the Phoebus cartel https://en.wikipedia.org/wiki/Phoebus_cartel