How did it come to be that only two companies supply all of the world's PC graphics chips?
from 58008@lemmy.world to nostupidquestions@lemmy.world on 01 Sep 18:18
https://lemmy.world/post/35307984

I know Intel is dipping its toe into the GPU market, but let’s be real, AMD and nVidia are the only options and have been for the last 20+ years. The manufacturers/assemblers of the complete graphics cards are varied and widespread, but the core tech comes from two companies only.

Why is this the case? Or am I mistaken and am just brainwashed by marketing, and there are in fact other viable options for GPUs?

Cheers!

#nostupidquestions

slazer2au@lemmy.world on 01 Sep 18:34 next collapse

Market consolidation. Easier to compete when you have 1 or 2 competitors rather than 5.

Kyrgizion@lemmy.world on 01 Sep 18:35 next collapse

It’s not even really “two companies”. Nvidia has 92% of the entire market. And the reason for that is mostly CUDA and its ecosystem which has become widespread among developers.
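
To make the lock-in concrete, here's a minimal sketch of what code written against Nvidia's CUDA APIs looks like (illustrative only; it assumes the CUDA toolkit and an Nvidia GPU, and the kernel is a made-up example). Everything touching cudaMalloc/cudaMemcpy and the <<<...>>> launch syntax runs only on Nvidia hardware unless it is ported, e.g. to AMD's HIP:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Minimal CUDA kernel: element-wise vector addition on the GPU.
__global__ void vecAdd(const float* a, const float* b, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host buffers
    float *ha = new float[n], *hb = new float[n], *hout = new float[n];
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device buffers: these runtime calls (and the <<<>>> launch below)
    // are the Nvidia-specific part of the ecosystem.
    float *da, *db, *dout;
    cudaMalloc((void**)&da, bytes);
    cudaMalloc((void**)&db, bytes);
    cudaMalloc((void**)&dout, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dout, n);
    cudaDeviceSynchronize();

    cudaMemcpy(hout, dout, bytes, cudaMemcpyDeviceToHost);
    printf("out[0] = %f\n", hout[0]);  // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dout);
    delete[] ha; delete[] hb; delete[] hout;
    return 0;
}
```

AMD's HIP mirrors this API almost one-to-one (hipMalloc, hipMemcpy, the same launch syntax), but the surrounding libraries, tooling and years of existing CUDA code are what keep most developers on Nvidia.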

Buffalox@lemmy.world on 01 Sep 19:10 collapse

I think 90+% marketshare is technically considered a monopoly in many places.
But the existence of AMD still makes a huge difference IMO, you do have an alternative option, and Nvidia doesn’t control the market completely.
Also personally I use AMD because I’m on Linux, and I don’t want the proprietary Nvidia driver to fuck up my system.
So AFAIK on Linux, the majority actually run AMD.

SnotFlickerman@lemmy.blahaj.zone on 01 Sep 19:26 next collapse

All-AMD Linux desktop build here plus all-AMD Linux laptop.

Shit just works.

psoul@lemmy.world on 02 Sep 06:03 next collapse

Sh 🤫 IT just works

massacre@lemmy.world on 02 Sep 06:09 collapse

Which laptop? All AMD here as well. ROG was what I found, but there's a very limited set of laptops with AMD GPUs, so I'm curious.

SnotFlickerman@lemmy.blahaj.zone on 02 Sep 06:23 collapse

Lenovo Thinkbook 15 G3 ACL

derpgon@programming.dev on 02 Sep 07:54 collapse

Rocking my second Thinkpad with full Linux, and it has been a very pleasant experience both times. Sadly, I had to switch a year back from an E15 to a T16 because the keyboard started failing and it wasn't serviceable on my end. Still works with an external one though.

massacre@lemmy.world on 04 Sep 03:52 collapse

How are we with Trust of Lenovo after their previous shenanigans?

LainTrain@lemmy.dbzer0.com on 02 Sep 00:27 collapse

I run Linux on my PC with a 3090 and my server running Debian oldstable has a 1070Ti.

The proprietary driver works OK. Nouveau is a valiant effort but isn't very useful. I personally can't imagine having an AMD GPU and not being able to play games, do ML, or have NVENC for transcoding (e.g. in a Jellyfin docker) and video editing; AMD is just not a serious company in the GPU space, sadly.

Proper competition is sorely needed. Such a shame too, because I’m quite glad to never have to buy an Intel CPU ever again and deal with their ass backwards ecosystem, it would be cool if AMD could pull off such a comeback for GPUs.

Edit: idk what the fuck is wrong with you people for down voting and simping for shit GPUs. The multi-billion dollar company doesn’t care. Stop defending them for shit products and stop attacking me just because I want to play video games once in a blue moon, jesus!

And when I say play games, I mean not at 1366x768 with AI upscaling and frame generation on whatever settings, but with Path Tracing on proper max settings; and not on a small portable, which is cool but no replacement for a proper home setup.

herrvogel@lemmy.world on 02 Sep 04:44 next collapse

Not being able to play games? Not serious? What the hell are you talking about? Did you compare a 3090 to a bottom tier ATI Radeon card or something? My RX 6800 was a fucking champ that was able to run everything I threw at it without a single problem and with quite satisfactory performance at 1440p. It was most definitely a very competent GPU for gaming.

ML and nvenc are extra features that not everyone needs or even wants.

LainTrain@lemmy.dbzer0.com on 02 Sep 08:27 collapse

Maybe I'm misinformed, but can the RX 6800 do Path Tracing in CP2077 at 1440p native at ~30fps, or at least on DLSS Quality or even FSR Quality (in both upscalers' cases with Ray Reconstruction and no frame generation, ofc)?

Can any AMD GPU do that at all, at a better price to performance point?

Because I was under the impression that basically no AMD cards can do practically any substantial RT of any kind, and that’s why it’s either missing on consoles or borderline unnoticeable and often confined to reflections where it’s pretty useless.

I just briefly looked it up, and all I could find was someone on Steam discussions with a 79xx XTX card talking about the old-timey simple Ray Tracing being "possible" at 1080p in Cyberpunk. Granted, there were some videos that looked promising, but I steer clear of YT and Reddit.

And as for features you don't need, idk, to me if I have to buy a GPU, I might as well get one that can do more things rather than fewer for the money. I didn't even play video games much when I got my 3090 years ago in summer 2021 for like $400 used from CeX, but I'm glad I got something that can, and something that can encode video well, whether it's recording, editing and rendering gaming footage to share with friends, transcoding media on a server, or doing ML (e.g. for self-hosting Immich).

If I were to buy a GPU now, i’d look at all features, even if I don’t want those things now, I’d ofc want them later, and for that - why would you choose an inferior option?

lordbritishbusiness@lemmy.world on 03 Sep 00:25 collapse

An RX 6800? Yes, in fact I was playing CP2077 with FSR and ray tracing on at 4K on a 6800. It was the first real card AMD had that could, though it did struggle at times.

Sadly replaced after 4 years with an RX 9060 of similar capability but better ray tracing.

AMD cards are only about 2-3 years behind Nvidia in a lot of specialised tasks, but they're trying to pace the behemoth that is Nvidia's R&D with a much smaller budget. ROCm works but is held up by compatibility issues with the newer CUDA features.

LainTrain@lemmy.dbzer0.com on 03 Sep 10:09 collapse

Ray tracing? No no, I said Path Tracing.

lordbritishbusiness@lemmy.world on 04 Sep 09:03 collapse

Hmm. I’ve decided I don’t like you.

It feels like you're setting an artificially high goal purely to make challenging your assertion impossible. I'm not even sure a 3090 can do path tracing. A 4090/5090, maybe.

Could an AMD card do it? Yes. RX 7900 possibly, a theoretical RX 9090 could if they bothered to release one.

But none of that matters really. Never has.

LainTrain@lemmy.dbzer0.com on 05 Sep 19:39 collapse

Yes, a 3090 can do Path Tracing, that's why I'm asking for one. It's not an artificially high goal - it's how I play one of my favourite games of all time - CP2077.

Obviously both the 4090 and 5090 can do that too and better, but that’s essentially out of reach for mere mortals, and even if it weren’t it’s not enough of an upgrade to really justify the massive price tag.

Still, if I were to get a new GPU, I wouldn't want a graphical downgrade, ideally an upgrade, since y'know, that's what matters in the price-to-value of computer components, and AMD can't provide that, so they're not a serious option. CPUs on the other hand? Hell yeah, I'm 100% AMD, but that's because they have a serious competitive offering in the 9800X3D.

Maybe to you gaming performance doesn’t matter when buying a gaming computer, in which case that’s your opinion and it’s ok, but it’s safe to say it matters to the rest of us.

rapchee@lemmy.world on 02 Sep 05:28 next collapse

lol “not being able to run games” have you heard of a little thing called the steam deck?

Blackmist@feddit.uk on 02 Sep 06:33 next collapse

Or indeed the PS5 which runs AMD chips.

LainTrain@lemmy.dbzer0.com on 02 Sep 08:32 collapse

And runs the few games it does have pretty poorly, all things considered.

survirtual@lemmy.world on 02 Sep 09:59 next collapse

This is nonsense and, frankly, sounds like guerrilla marketing for nvidia.

All things considered, I can play any game I want on the steam deck, which has an old SoC by today’s standards. A newer AMD gpu can run anything at max settings on a linux machine.

So again, either you are grossly misinformed or working for nvidia to sow gentle doubt. Either way, stop it.

LainTrain@lemmy.dbzer0.com on 06 Sep 01:45 collapse

Do you not play many games? Good luck playing CP2077 with PT or heck even MSFS2020 at good (read: max) settings on 1440p.

The deck is great but I can’t even run NMS on high settings on it.

survirtual@lemmy.world on 07 Sep 10:49 collapse

Why would I want to play at max settings? That adds very little to the gameplay for me.

I can play any game tweaking settings, and I can render at 720p + upscale if a game is demanding. This makes nearly any game enjoyable.

High settings are irrelevant, but if you want high settings, any AMD card from the past 2 years will more than deliver max performance for anything you throw at it.

For a handheld, portable device that costs under $500, I am okay reducing graphics quality for portability and gameplay.

Blackmist@feddit.uk on 02 Sep 15:58 collapse

Runs it better than any PC costing a similar amount.

You can barely get an entry level GPU for that.

LainTrain@lemmy.dbzer0.com on 02 Sep 08:25 collapse

I have a steam deck. It’s a portable, it’s really cool for what it is but as a main gaming system it can’t exactly compete visually nor performance wise with being able to run Path Tracing in CP2077 for instance, which I’d say is borderline required to enjoy the game’s visuals.

It’s like comparing the PS3 and the PSP or the N64 and the Gameboy Color. Both are cool, one doesn’t replace the other nor does it have to.

rapchee@lemmy.world on 02 Sep 09:03 collapse

the point was about “not being able to game on amd”
yeah duh mobile chips are less performant, but still forza horizon 4-5 runs better on my steam deck than on my desktop pc with an rtx 2080 in linux

FreedomAdvocate@lemmy.net.au on 04 Sep 06:51 next collapse

At the same resolution?

rapchee@lemmy.world on 06 Sep 12:05 collapse

i think you probably meant it the other way, can the deck run it in 1080p, but i just tried 720p on my desktop and it's still choppy, although it does make the average fps higher.
also it's not unplayable, it's just annoying, especially knowing that on windows it can do a stable 60 fps with "maximum" settings (that's below "ultra" and "extreme") at 1080p.
also also, this is a "special" thing, most games run just as smoothly as on windows; the forzas and the "new" mafia games are the few exceptions out of my 1000+ games library.

FreedomAdvocate@lemmy.net.au on 06 Sep 23:42 collapse

No, exactly what I said - does your steam deck run the games better at the same resolution as your pc with a 2080 does?

It running them “better” but at a lower resolution is an unfair comparison.

rapchee@lemmy.world on 07 Sep 04:08 collapse

maybe i wasn’t clear, but i tried 720p on both the desktop and deck, and the nvidia desktop was still choppy

FreedomAdvocate@lemmy.net.au on 07 Sep 04:36 collapse

Must have a very weak cpu or some other bottleneck then, as an RTX 2080 is in a whole different league to the GPU in the Steam Deck's APU - like in the 5x better range.

Linux also doesn’t help from all I’ve seen, with terrible drivers for Nvidia gpus.

rapchee@lemmy.world on 07 Sep 05:51 collapse

it is probably the linux nvidia driver, because, as i mentioned before, it runs smoothly on windows on the same machine on higher settings

LainTrain@lemmy.dbzer0.com on 06 Sep 01:47 collapse

Why would I game on shit settings when I can game on good settings for the same/less money?

rapchee@lemmy.world on 06 Sep 12:08 collapse

i’m not sure if you meant like deck vs desktop, but if so, because the deck is a handheld, you can play anywhere

thelittleblackbird@lemmy.world on 02 Sep 07:02 collapse

Sorry, I needed to downvote your comment due to the false information. Nothing personal, just keeping the house clean.

LainTrain@lemmy.dbzer0.com on 02 Sep 08:23 collapse

What false information?

thelittleblackbird@lemmy.world on 02 Sep 13:45 collapse

AMD GPUs are inside the Xbox and PS5, without even taking into account handhelds like the Steam Deck.

For ML it's often better to use AMD cards because they tend to have more VRAM, and many, many models can be trained on AMD.

And about the transcoding comment, I won't even bother with it.

In summary: tell me you don't have any clue without telling me you don't have any clue.

FreedomAdvocate@lemmy.net.au on 04 Sep 06:54 collapse

AMD cards are not really better than Nvidia cards at anything. That’s not being a “shill”, it’s just the truth. They’re cheaper and easier to find in stock, that’s about it.

thelittleblackbird@lemmy.world on 04 Sep 10:47 collapse

AMD are not better than Nvidia -> sure

I prefer Nvidia over AMD for everything -> perfect, that is your opinion and a respectable one

With AMD you can not do AAA gaming, ML or just transcoding -> a lie, simply, nothing more to add

And I will ignore the sentence about AMD not being a serious company because it is too absurd to discuss

FreedomAdvocate@lemmy.net.au on 05 Sep 07:06 collapse

I assume you think you’re replying to someone else as I’ve said none of those things other than the first.

thelittleblackbird@lemmy.world on 05 Sep 09:42 collapse

Yep, sorry, my mistake, I confused you with another user

False@lemmy.world on 01 Sep 18:48 next collapse

Intel never really tried to be a real competitor until a few years ago. 3dfx had market dominance in 90s but then basically committed suicide. There were a few other smaller manufacturers in the late 90s and early 2000s but they never really had significant market share and couldn’t keep up with the investment required to be competitive.

Buffalox@lemmy.world on 01 Sep 19:14 next collapse

3dfx had market dominance in 90s but then basically committed suicide.

As I remember it, it was Nvidia that killed 3DFX; Nvidia had an absolutely cutthroat development pace, and 3DFX simply couldn't keep up, so they ended up being bought by Nvidia.
But oh boy Voodoo graphics were cool when they came out! An absolute revolution to PC gaming.

False@lemmy.world on 01 Sep 19:39 next collapse

Nvidia helped but 3DFX released a couple of bad products in a row.

Buffalox@lemmy.world on 01 Sep 19:57 collapse

IDK, I think it was because they couldn't keep up with Nvidia; I bought the Voodoo 2 when it was already at about half price.
After that it was basically lights out for Voodoo.

Intel has somewhat the same problem, I think, because their GPU is reasonably good and for the customer it's a competitive product.
But for Intel, the GPU chip probably costs 3 times as much to make as a comparable Nvidia or AMD chip, because Intel needs a GPU twice as big to be competitive!
That means that Intel is probably not making any profit from their GPU division.
Same with Voodoo: they simply couldn't keep up enough to make a profit. They had to compete with Nvidia, which quickly surpassed 3DFX, and since Nvidia was better, Voodoo had to be cheaper, but they couldn't make the cards cheap enough to turn a profit.

It's not that Voodoo got worse, because obviously they didn't. But Nvidia had a development cycle that was unheard of at the time. It wasn't just 3DFX that couldn't keep up; so did S3, Matrox and ATI. And ATI was by far the biggest GPU maker at the time. ATI however made a strong comeback as the only competitor to Nvidia in mainstream performance desktop graphics and gaming, and was later bought by AMD.

mereo@piefed.ca on 01 Sep 20:46 collapse

We cannot forget that 3dfx went under when they bought STB to manufacture their own video cards instead of letting their board partners do it.

Buffalox@lemmy.world on 02 Sep 06:12 collapse

Maybe you are right, but I think they did that because they thought that would help them remain competitive, keeping the profit share that would normally go to board vendors, allowing them to sell cheaper while still making money, and compete better against Nvidia.

Maybe I remember it wrong, but I think Voodoo was already dying with Voodoo 2.

SharkAttak@kbin.melroy.org on 01 Sep 19:57 next collapse

Yeah, I remember Matrox, PowerVR, Number Nine(?)... But R&D probably became too costly, and despite DirectX leveling the field a bit, some were forced to step back or sell.

mereo@piefed.ca on 01 Sep 20:44 collapse

3dfx had market dominance in 90s but then basically committed suicide.

Very true. They committed suicide when they bought STB so that they could manufacture their own video cards. They didn't just focus on chip R&D, they needed to manufacture and market their own video cards instead of letting board partners do it.

Buffalox@lemmy.world on 01 Sep 18:51 next collapse

You are not mistaken.

In the early days of the PC, there were lots of GPU options, as in literally dozens. So the first part of the question is: why did almost all of them disappear? The answer is that the market became much more complicated with Windows, with far higher demands on the software side, and many hardware vendors suck at making software. So over time the best combo of hardware and driver beat out the other high-end manufacturers until we ended up with just 2, and the on-board / on-chip GPU made every low-end third-party GPU next to irrelevant, with very little possibility of making a profit.

The low-end chips were no longer needed, as they can now be had cheaper and more efficiently as part of the CPU from both AMD and Intel. And since these are the only 2 CPU options for x86, now that VIA has discontinued their x86 line acquired from Cyrix, there is no low-end entry point in the PC market for a new GPU maker.
The natural evolution is to start at the lower end and, if successful, work up. That is not possible in the PC market, which makes entry near impossible except with enormous investments that may never pay off, especially since the PC is a dwindling market.

As you mention, Intel is dipping their toes in, but despite making a pretty big effort, investing a lot to develop a better GPU, and actually delivering a good product at a reasonable price that should be absolutely competitive on paper, their market share is minuscule, because Nvidia and AMD together dominate, already fill the needs of the mid to higher-end market, and have brand recognition for graphics.

It's not that there aren't technologies that could possibly compete if scaled up for PC; those are actually pretty numerous on phones and tablets. But you can't port them to PC cheaply, because there is no market segment for them to slide into easily.
It would require major investments to make them hardware-performance competitive at larger scales, plus investments in good drivers. Intel had a big head start in both respects, since they were already making on-chip graphics with existing drivers. And still they are struggling, despite delivering a good product and despite people screaming for a third option because of high GPU prices.

This may not be the entire explanation, but I think it’s a very significant part of it.
The better question IMO is why Intel never became more popular, considering how much people have raved that more competition in the GPU market is required.

And the explanation for that is:

but let’s be real, AMD and nVidia are the only options

Except Intel actually presented a good alternative, but was never seriously considered by most people for whatever reason.

Personally I didn’t consider Intel, because I remember earlier attempts by Intel, where Intel quickly left the market again. And I didn’t want a GPU where I’m left without driver support a year after I purchased it.
So in my case it was lack of trust in Intel to stay the course. But every other maker would have that exact same issue.
There have been a few attempts in the past from other makers, but they all had performance or driver issues or both, and left the market quickly. Intel delivered a stellar product by comparison. And if Intel drops out of GPUs again, I think there's a pretty big risk it may have been our last chance for a third significant mid-to-high-end GPU maker on PC.

TLDR:

  1. All the old competitors couldn’t cut it on either the hardware or software side, and so they died out.
  2. It’s an insanely expensive market to enter and to stay in, with high risk of never making a profit.

[deleted] on 01 Sep 18:55 next collapse

.

jacksilver@lemmy.world on 01 Sep 21:03 next collapse

The other thing you didn’t talk about was the size of the market in general.

As onboard graphics (built into the CPU) became popular, the biggest reason to buy a discrete GPU was gaming or video processing. Which, while a significant market, isn't huge.

Over the past couple of decades, GPUs have made headway as the way to do machine learning/AI. Nvidia spent a lot of time and money making this process easier on their GPUs, which led to them not only owning the graphics market but also the much bigger ML/AI market. And I say the AI/ML market is bigger simply because GPUs are being installed in huge quantities in data centers.

Edit: My point being that the market shrank before GPUs became so critical. To counteract Nvidia's stranglehold, a lot of big tech companies are creating custom TPUs (tensor processing units), which are just ML/AI-specific chips.

Kyrgizion@lemmy.world on 02 Sep 00:21 collapse

Region | Q2 2025 Growth Rate | Key Factors
North America | -0.5% | Declining consumer demand, tariff concerns
EMEA | +5.3% | Stronger-than-usual seasonal demand
APAC | Flat | Stabilization after previous declines

Not exactly a dwindling market except in the US

Buffalox@lemmy.world on 02 Sep 06:20 collapse

You don't say which market you are describing, but:
The PC market has been dwindling for decades, the PC gaming market has also been shrinking as consoles take a bigger share, and for the past 5-10 years GPU prices have been high and wildly unstable.
Lately prices have returned to more normal levels when accounting for inflation, which could explain a bump this year.
USA is an outlier because of tariffs.

AliasVortex@lemmy.world on 01 Sep 19:12 next collapse

The short concise answer is mostly cost. Nvidia, AMD, and Intel are all spending multiple billions of dollars per year in R&D alone. It’s just not a space where someone can invent something in their garage and disrupt the whole industry (like, even if someone were to come out of left field with a revolutionary chip design, they’d need to convince investors that they’d be a better bet than literal trillion dollar companies).

XeroxCool@lemmy.world on 02 Sep 02:09 collapse

The question isn’t just about upstarts, it’s asking how we got here. We can’t start Ovidia in a garage, but Nvidia did at one point. So where’d everyone else go? What partnerships and preferences put Nvidia on top?

despoticruin@lemmy.zip on 02 Sep 03:06 next collapse

Patents and the fact that these chips are massively complex designs. We are talking architecture on the complexity level of the empire state building, most of which is a blend of proprietary designs developed over decades.

Nobody is saying you can’t do it in your garage, in fact it’s easier than ever to start. Let me know how it goes, look into some of the recent tapeout challenges to get an idea of what you are proposing people just make in a garage.

XeroxCool@lemmy.world on 02 Sep 04:17 collapse

You said exactly what the parent comment said and ignored the secondary part of OP’s intent. But thanks?

AliasVortex@lemmy.world on 02 Sep 04:51 next collapse

I was content to let the other comments address the history since I'm not particularly well versed there (and there's already enough confidently incorrect bullshit in the world). I mostly just wanted to interject on why there aren't more chip companies, beyond just hand-waving it away as "market consolidation", which is true but doesn't take into account that the barrier to entry in this space is less on the scale of opening up a sandwich restaurant or boutique clothing store and more on the order of waking up tomorrow and deciding to compete with your local power/water utility.

The answer also gets kind of fuzzy outside the conventional computer space, where single-board / system-on-a-chip designs are common (stuff like Raspberry Pis or smartphones), since those technically have graphics modules designed by companies like Qualcomm (Snapdragon) or MediaTek. It's also worth noting that computers have gotten orders of magnitude more complicated compared to the era of starting a tech company in your garage.

If it helps answer your question, according to Wikipedia, most of the other GPU companies have either been acquired, gone bankrupt, or aren’t competing in the Desktop PC market segment.

Pieisawesome@lemmy.dbzer0.com on 02 Sep 13:55 next collapse

If you are really curious, read Chip War by Chris Miller.

HobbitFoot@thelemmy.club on 04 Sep 16:45 collapse

In general, tech is an industry with high fixed costs and low costs per unit sold. That kind of pricing structure tends to limit competition.

Nvidia was founded at a time when outsourcing chip fabrication was common and viable, so all Nvidia had to do was focus on design. After a series of failures and near bankruptcy, Nvidia was finally able to invent the idea of a GPU and sell it to the marketplace.

After that Nvidia bought several companies to round out its patent portfolio and capabilities, remaining a dominant company in an industry it created. The only real competition was with other companies that had previous chip design experience.

carl_dungeon@lemmy.world on 01 Sep 20:01 next collapse

Wait till I tell you about the handful of companies that own EVERYTHING else you buy.

Modern_medicine_isnt@lemmy.world on 01 Sep 20:57 next collapse

I've noticed how poorly GPUs are classified, and how it seems every interesting piece of AI software just has a list of GPUs it can work on. So I can see customers just locking into one brand so they have less to memorize.

Part4@infosec.pub on 02 Sep 02:19 next collapse

Huawei have just started selling a gpu to the public that they made a few years back. It has a lot of VRAM, but it is old slow RAM, and doesn’t have the software infrastructure (drivers etc) nvidia has. So currently it isn’t a great option, but if you look at phones, or electric cars, there is every chance they become competitive in a relatively short time period. Time will tell.

zxqwas@lemmy.world on 02 Sep 05:32 next collapse

I’m happy we still have two.

Short answer: hard to start a new company that can compete. Over the decades all the other companies have done poorly and gone bankrupt or been bought out.

nialv7@lemmy.world on 02 Sep 06:54 next collapse

Duopolies are very prevalent in tech, think AMD/Intel, AMD/nvidia, Windows/MacOS, iOS/Android, etc.

As to why? idk. Big companies buy up small ones until only one is left, so they don't get sued for being a monopoly? Maybe, but I don't think that applies to all those cases.

scarabic@lemmy.world on 02 Sep 16:40 collapse

In tech it’s often a bad thing to have 37 of something. How many phone operating systems can app developers reasonably serve? Does it benefit consumers to have 19 different graphics chip standards?

bluecat_OwO@lemmy.world on 02 Sep 09:11 next collapse

Still very distant in the future, but I believe in the power of ARM. Dedicated GPUs are beasts now, but I am rooting for ARM to win this race.

mimic_dev@lemmy.world on 02 Sep 14:31 next collapse

One could say it’s an arms race perhaps

HailSeitan@lemmy.world on 02 Sep 15:00 collapse

Never root for a monopoly over at least some semblance of competition

bluecat_OwO@lemmy.world on 02 Sep 17:20 collapse

this is my only gripe with arm but at least I think industry standards are changing and everyone is taking risc seriously so if the giants are noticing it, the monopoly can easily be avoided

lemming741@lemmy.world on 02 Sep 10:42 next collapse

You've watched Google, Facebook, and Apple do it for the last 20 years. If a good idea is spotted early enough, they buy the whole company before it can make it to market and grow into a threat. It happens in any emerging tech, and you're watching it happen now in the LLM space. Companies burn cash, waiting for their competitors to make a mistake or run out of money. Then they buy out the struggling company, absorbing any tech it might have, maybe some branding, but more importantly its customers. Now they can jack up prices once market forces are eliminated.
If not for the threat of anti-trust laws, you would see single-company rule in every single sector. That is the end goal of a company: a monopoly that crushes potential competition and squeezes consumers.
Railroads, telephone, petroleum, internet, airlines: all ended up as regional monopolies.

brucethemoose@lemmy.world on 02 Sep 14:34 next collapse

It’s even better than that:

They all come from Taiwan Semiconductor (TSMC).

There used to be more of a split between many fabs. Then it was TSMC/Global Foundries/Samsung Foundry. Then it was TSMC/Samsung Foundry. Now AFAIK all GPUs are TSMC, with Nvidia’s RTX 3000 series (excluding the A100) being the last Samsung chip. Even Intel fabs Arc there, as far as I know.

Hopefully Intel won’t kill Arc, as they are planning to move it back to their fabs.

brucethemoose@lemmy.world on 02 Sep 14:39 next collapse

Oh, and there are other graphics makers that could theoretically work on linux, like Imagination’s PowerVR, and some Chinese startups. Qualcomm’s already trying to push into laptops with Adreno (which has roots in AMD/ATI, hence ‘Adreno’ is an anagram for ‘Radeon’)

The problem is that making a desktop-sized GPU has a massive capital cost (over $1,000,000,000, maybe even tens of billions these days) just to 'tape out' a single chip, much less a whole product line, and AMD/Nvidia are just so far ahead in terms of architecture. It's basically uneconomical to catch up without a massive geopolitical motivation like there is in China.

HailSeitan@lemmy.world on 02 Sep 15:09 next collapse

Matt Stoller had a nice writeup recently in his monopoly newsletter BIG about how we got into the current mess. TL;DR: basically financialization (prioritizing stock price over innovation, like at Boeing) and a lack of antitrust enforcement as a previously competitive market got monopolized (see chart below)

[Image: https://lemmy.world/pictrs/image/f23a0cc7-ac46-418d-b986-ebf2a3b77b6c.jpeg]

SmoothIsFast@lemmy.world on 04 Sep 06:54 collapse

Interesting to note that most of those are not chip makers but fabless semiconductor companies that outsource all of their production to GlobalFoundries and TSMC.

SoftestSapphic@lemmy.world on 02 Sep 18:15 next collapse

Capitalism stifles innovation and suppresses competition.

FreedomAdvocate@lemmy.net.au on 04 Sep 06:56 collapse

It’s insanely costly, insanely hard, and an insanely hard market for a new competitor to enter. Even something as “simple” as dev support is a gigantic hurdle.