GitHub is introducing rate limits for unauthenticated pulls, API calls, and web access (github.blog)
from chaospatterns@lemmy.world to programming@programming.dev on 14 May 23:21
https://lemmy.world/post/29668553

An update from GitHub: github.com/orgs/community/discussions/159123#disc…

The rates are here: docs.github.com/…/rate-limits-for-the-rest-api?ap…

#programming

threaded - newest

traches@sh.itjust.works on 14 May 23:27 next collapse

Probably getting hammered by ai scrapers

adarza@lemmy.ca on 15 May 01:11 next collapse

you mean, doin’ what microsoft and their ai ‘partners’ do to others?

Also, as Microsoft appears to have recognized scraping for AI training as a problem, are you ceasing your own scraping activities on public code and the larger web, or is this a case of double standards?

rickyrigatoni@lemm.ee on 15 May 12:33 collapse

Yeah but they’re allowed to do it because they have brazillions of dollars.

Ugurcan@lemmy.world on 15 May 18:57 collapse

They literally own GitHub. Brazillions well spent.

potatopotato@sh.itjust.works on 15 May 06:14 next collapse

Everything seems to be. There was a period where you could kinda have a sane experience browsing over a VPN or otherwise from a cloud-service IP range, but especially over the past 6 months or so, things have gotten exponentially worse by the week. Everything is moving behind Cloudflare or other systems

db0@lemmy.dbzer0.com on 15 May 06:48 collapse

The funny thing is that rate limits won’t help them with genai scrapers

bigkahuna1986@lemmy.ml on 14 May 23:59 next collapse

Just browsing GitHub I’ve hit this limit

k_rol@lemmy.ca on 15 May 00:40 next collapse

Just browse authenticated, you won’t have that issue.

adarza@lemmy.ca on 15 May 01:04 collapse

that is not an acceptable ‘solution’ and opens up an entirely different and more significant can o’ worms instead.

adarza@lemmy.ca on 15 May 01:01 next collapse

i’ve hit it many times so far… even as quick as the second page view (first internal link clicked) after more than a day or two since the last visit (yes, even with cleaned browser data or private window).

it’s fucking stupid how quick they are to throw up a roadblock.

Xanza@lemm.ee on 15 May 01:09 collapse

Then login.

tal@lemmy.today on 15 May 00:03 next collapse

60 req/hour for unauthenticated users

That’s low enough that it may cause problems for a lot of infrastructure. Like, I’m pretty sure that the MELPA emacs package repository builds out of git, and a lot of that is on github.

NotSteve_@lemmy.ca on 15 May 00:23 next collapse

Do you think any infrastructure is pulling that often while unauthenticated? It seems like an easy fix either way (in my admittedly non devops opinion)

Semi_Hemi_Demigod@lemmy.world on 15 May 01:06 next collapse

If I’m using Ansible or something to pull images it might get that high.

Of course the fix is to pull it once and copy the files over, but I could see this breaking prod for folks who didn’t write it that way in the first place

Ephera@lemmy.ml on 15 May 04:09 collapse

It’s gonna be problematic in particular for organisations with larger offices. If you’ve got hundreds of devs/sysadmins under the same public IP address, those 60 requests/hour are shared between them.

Basically, I expect unauthenticated pulls to no longer be possible at my day job, which means repos hosted on GitHub become a pain.

lazynooblet@lazysoci.al on 15 May 06:44 next collapse

Same problem for CGNAT users

NotSteve_@lemmy.ca on 15 May 12:11 next collapse

Ah yeah that’s right, I didn’t consider large offices. I can definitely see how that’d be a problem

timbuck2themoon@sh.itjust.works on 15 May 12:19 collapse

Quite frankly, companies shouldn’t be pulling willy-nilly from GitHub or npm anyway. It’s trivial to set up something to cache repos or artifacts. Plus, it guards against your builds breaking when GitHub is down.
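A cache like that can be as small as a bare mirror refreshed on a timer. A minimal local sketch (the `upstream` repo and paths here are stand-ins for a real GitHub HTTPS URL and a shared server path):

```shell
# Stand-in upstream repo; in practice this would be a GitHub HTTPS URL.
git init -q upstream
git -C upstream -c user.name=dev -c user.email=dev@example.com \
    commit -q --allow-empty -m "init"

# One fetch from upstream, kept as a bare mirror...
git clone -q --mirror upstream cache.git
# ...refreshed periodically (cron/systemd timer), one upstream request per refresh:
git --git-dir=cache.git remote update --prune
# Everyone else clones from the cache, costing GitHub zero requests:
git clone -q cache.git project
```

Artifact caches (e.g. Artifactory, Nexus, or a plain caching proxy) play the same role for npm and friends.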

Ephera@lemmy.ml on 15 May 23:34 collapse

It’s easy to set up a cache, but what’s hard is convincing your devs to use it.

Mainly because, well, it generally works without configuring the cache in your build pipeline, as you’ll almost always need some solution for accessing the internet anyways.

But there are other reasons, too. You need authentication or a VPN for accessing a cache like that. Authentication means you have to deal with credentials, which is a pain. A VPN means it’s likely slower than downloading directly from the internet, at least while you’re working from home.

Well, and it’s also just yet another moving part in your build pipeline. If that cache is ever down or broken or inaccessible from certain build infrastructure, chances are it will get removed from affected build pipelines and those devs are unlikely to come back.


Having said that, of course, GitHub is promoting caches quite heavily here. This might make it actually worth using for the individual devs.

Xanza@lemm.ee on 15 May 01:08 next collapse

That’s low enough that it may cause problems for a lot of infrastructure.

Likely the point. If you need more, get an API key.

lolcatnip@reddthat.com on 15 May 17:54 collapse

Or just make authenticated requests. I’d expect that to be well within the capabilities of anyone using MELPA, and 5000 requests per hour shouldn’t pose any difficulty considering MELPA only has about 6000 total packages.
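As a quick sanity check on those numbers (both figures are the rough ones quoted in this thread, not exact counts), here's the count of hourly rate-limit windows needed to make one request per package:

```shell
packages=6000   # approximate MELPA package count
authed=5000     # authenticated requests per hour
unauthed=60     # unauthenticated requests per hour

# Ceiling division: windows needed to touch every package once.
echo $(( (packages + authed - 1) / authed ))      # prints 2
echo $(( (packages + unauthed - 1) / unauthed ))  # prints 100
```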

Xanza@lemm.ee on 15 May 21:13 collapse

This is my opinion on it, too. Everyone is crying about the death of Github when they’re just cutting back on unauthenticated requests to curb abuse… lol seems pretty standard practice to me.

hinterlufer@lemmy.world on 15 May 05:19 collapse

I didn’t think of that - also for nvim you typically pull plugins from git repositories

hackeryarn@lemmy.world on 15 May 00:22 next collapse

If Microsoft knows how to do one thing well, it’s killing a successful product.

Semi_Hemi_Demigod@lemmy.world on 15 May 01:06 next collapse

RIP Skype

adarza@lemmy.ca on 15 May 01:15 collapse

we could have had bob or clippy instead of ‘cortana’ or ‘copilot’

Gork@lemm.ee on 15 May 01:28 next collapse

Microsoft really should have just leaned into it and named it Clippy again.

Semi_Hemi_Demigod@lemmy.world on 15 May 16:47 collapse

If Cortana was named Bob I don’t think people would have less of a problem with it

henfredemars@infosec.pub on 15 May 01:21 collapse

I came here looking for this comment. They bought the service to destroy it. It’s kind of their thing.

douglasg14b@lemmy.world on 15 May 07:55 collapse

Github has literally never been doing better. What are you talking about??

MITM0@lemmy.world on 15 May 09:04 collapse

We are talking about EEE

lolcatnip@reddthat.com on 15 May 17:48 collapse

What has Microsoft extinguished lately? I’m not a fan of Microsoft, but I think EEE is a silly thing to reference because it’s a strategy that worked for a little while in the 90s that Microsoft gave up on a long time ago because it doesn’t work anymore.

Like, what would be the purpose of them buying GitHub just to destroy it? And if that was their goal, why haven’t they done it already? Microsoft is interested in one thing: making money. They’ll do evil things to make money, just like any other big corporation, but they don’t do evil things just for the sake of being evil. It’s very much in their business interest to be seen as trustworthy, and being overly evil runs counter to that need.

MITM0@lemmy.world on 16 May 02:17 collapse

It’s a slow process

sturlabragason@lemmy.world on 15 May 00:24 next collapse

No no, no no no no, no no no no, no no there’s no limit

forgejo.org

Xanza@lemm.ee on 15 May 01:10 next collapse

Until there is.

I think people are grossly underestimating the sheer size and significance of the issue at hand. Forgejo will very likely eventually get to the same point Github is at right now, and will have to employ some of the same safeguards.

FlexibleToast@lemmy.world on 15 May 02:00 collapse

Except Forgejo is open source and you can run your own instance of it. I do, and it’s great.

Xanza@lemm.ee on 15 May 04:10 collapse

That’s a very accurate statement which has absolutely nothing to do with what I’ve said. The fact of the matter is that those who seek out a GitHub alternative generally do so because they dislike Microsoft or closed-source platforms. Which is great, but platforms with hosted instances see an overwhelmingly significant portion of users who visit because they choose not to selfhost. It’s a lifecycle.

  1. Create cool software for free
  2. Cool software gets popular
  3. Release new features and improve free software
  4. Lots of users use your cool software
  5. Running software becomes expensive, monetize
  6. Software becomes even more popular, single stream monetization no longer possible
  7. Monetize more
  8. Get more popular
  9. Monetize more

By step 30 you’re selling everyone’s data and pushing resource restrictions because it’s expensive to run a popular service that’s generally free. That doesn’t change simply because people can selfhost if they want.

FlexibleToast@lemmy.world on 15 May 10:21 collapse

To me, this reads strongly like someone who is confidently incorrect. Your starting premise is incorrect. You are claiming Forgejo will do this. Forgejo is nothing but an open source project designed to self host. If you were making this claim about Codeberg, the project’s hosted version, then your starting premise would be correct. Obviously, they monetize Codeberg because they’re providing a service. That monetization feeds Forgejo development. They could also sell official support for people hosting their own instances of Forgejo. This is a very common thing that open source companies do…

lolcatnip@reddthat.com on 15 May 17:59 next collapse

It just sounds like they didn’t understand the relationship between Forgejo and Codeberg. I didn’t either until I looked it up just now. IMHO their comment is best interpreted as being about Codeberg. People running their own instances of Forgejo are tangential to the topic at hand.

FlexibleToast@lemmy.world on 15 May 18:59 collapse

Either way, their comment is out of place. A Codeberg comment when the original comment was pointing people to Forgejo.

Xanza@lemm.ee on 15 May 21:29 collapse

Obviously, they monetize Codeberg because they’re providing a service. That monetization feeds Forgejo development. They could also sell official support for people hosting their own instances of Forgejo. This is a very common thing that open source companies do…

This is literally what I said in my original post. Free products must monetize; as they get larger, they have to continue to monetize more and more because development and infrastructure costs continue to climb… and you barged in as if this somehow doesn’t apply to Forgejo, then literally listed examples of why it does. I mean, Jesus, my guy.

You are claiming Forgejo will do this.

I’m claiming that it is a virtual certainty of the age of technology that we live in that popular free products (like Github) eventually balloon into sizes which are unmanageable while maintaining a completely free model (especially without restriction), which then proceed to get even more popular at which time they have to find new revenue streams or die.

It’s what’s happened with Microsoft, Apple, Netflix, Hulu, Amazon Prime, Amazon Prime Video, Discord, Reddit, Emby, MongoDB, just about any CMS, CRM, or forum software, and is currently happening to Plex. I mean, the list is quite literally endless. You could name any large software company that provides a free or mostly free product and you’ll find a commercial product that they use to fund future development, because their products become so popular and so difficult/costly to maintain that they were forced into a monetization model to continue development.

Why you think Forgejo is the only exception to this natural evolution is beyond my understanding.

I’m fully aware of the difference between Codeberg and Forgejo. And Forgejo is a product, and it’s exceptionally costly to build and maintain. Those costs will continue to rise as it has to change over time to suit more and more user needs. People seem to heavily imply that free products cost nothing to build, which is just insane.

I’ve been a FOSS developer for 25 years and a tech PM for almost 20. I speak with a little bit of authority here because it’s my literal wheelhouse.

FlexibleToast@lemmy.world on 16 May 01:46 collapse

That’s a huge wall of text to still entirely miss the point. Forgejo is NOT a free service. It is an open-source project that you can host yourself. Do you know what will happen if Forgejo ends up enshittifying? They’ll get forked. Why do I expect that? Because that’s literally how Forgejo was created: it forked Gitea. Why don’t I think that will happen any time soon? It has massive community buy-in, including the Fedora Project. You being a PM explains a lot about being confidently incorrect.

Xanza@lemm.ee on 16 May 20:10 collapse

That’s a huge wall of text to still entirely miss the point.

So then it makes sense that you didn’t read it, since I very specifically and intentionally touch on the subjects you bring up.

If you’re not going to read what people reply, then don’t even bother throwing your opinion around. Just makes you look like an idiot tbh.

ABetterTomorrow@lemm.ee on 15 May 01:51 next collapse

Dude, this is cool!

mesamunefire@piefed.social on 15 May 02:05 collapse

It works really well too. I have an instance.

yo_scottie_oh@lemmy.ml on 15 May 03:12 next collapse

No, no limits, we’ll reach for the skyyyy

furikuri@programming.dev on 16 May 02:49 collapse

Amazon’s AI crawler is making my git server unstable

End of the day someone still has to pay for those requests

theunknownmuncher@lemmy.world on 15 May 00:34 next collapse

LOL!!! RIP GitHub

EDIT: trying to compile any project from source that uses git submodules will be interesting. E.g. ROCm has more than 60 submodules to pull in 💀
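Each submodule is at least one more unauthenticated fetch during a recursive clone, so it's worth knowing how many a project will trigger before cloning blindly. A sketch with a hypothetical two-entry `.gitmodules` (paths and URLs are made up):

```shell
# Hypothetical .gitmodules, standing in for a real project's file:
cat > .gitmodules <<'EOF'
[submodule "deps/a"]
    path = deps/a
    url = https://github.com/example/a.git
[submodule "deps/b"]
    path = deps/b
    url = https://github.com/example/b.git
EOF

# Count the submodule URLs a `git clone --recurse-submodules` would fetch:
git config --file .gitmodules --name-only --get-regexp '\.url$' | wc -l   # prints 2
```

A fresh recursive clone makes at least one request per entry on top of the main clone, which is how a single large build can burn through an unauthenticated hour.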

sxan@midwest.social on 15 May 01:13 collapse

The Go module system pulls dependencies from their sources. This should be interesting.

Even if you host your project on a different provider, many libraries are on github. All those unauthenticated Arch users trying to install Go-based software that pulls dependencies from github.

How does the Rust module system work? How does pip?

adarza@lemmy.ca on 15 May 01:20 next collapse

already not looking forward to the next updates on a few systems.

mesamunefire@piefed.social on 15 May 02:06 collapse

Yeah this could very well kill some package managers. Without some real hard heavy lifting.

irelephant@programming.dev on 15 May 06:22 collapse

scoop relies on git repos to work (scoop.sh - windows package manager)

mesamunefire@piefed.social on 15 May 16:11 collapse

Rip

Ephera@lemmy.ml on 15 May 04:34 next collapse

For Rust, as I understand, crates.io hosts a copy of the source code. It is possible to specify a Git repository directly as a dependency, but apparently, you cannot do that if you publish to crates.io.

So, it will cause pain for some devs, but the ecosystem at large shouldn’t implode.
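For illustration, the difference shows up in `Cargo.toml` (the names and URL here are hypothetical): a plain version requirement is served from crates.io's hosted copy, while a `git` dependency fetches from GitHub directly and would count against the limit:

```toml
[dependencies]
serde = "1.0"  # fetched via the crates.io registry, not GitHub
example-lib = { git = "https://github.com/example/example-lib" }  # direct GitHub fetch
```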

sxan@midwest.social on 15 May 18:23 collapse

I should know this, but I think Go’s module metadata server also caches, and the compilers look there first if you don’t override it. I remember Drew got pissed at Go because the package server was pounding on sr.ht for version information; I really should look into those details. It Just Works™, so I’ve never bothered to read up on how it works. A lamentable oversight I’ll have to correct with this new rate limit. It might be no issue after all.

Ephera@lemmy.ml on 15 May 23:14 collapse

I also remember there being a tiny shitstorm when Google started proxying package manager requests through their own servers, maybe two years ago or so. I don’t know what happened with that, though, or if it’s actually relevant here…

UnityDevice@lemmy.zip on 15 May 10:03 collapse

Compiling any larger Go application would hit this limit almost immediately. For example, podman is written in Go and has around 70 dependencies, or about 200 when including transitive dependencies. Not all the dependencies are hosted on GitHub, but the vast majority are. That means that with a limit of 60 requests per hour, it would take you 3 hours to build podman on a new machine.
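The arithmetic behind that estimate, as a sketch (200 and 60 are the figures from the comment; it assumes one request per dependency and that the first hourly window is available immediately):

```shell
deps=200   # transitive dependencies, mostly on GitHub
limit=60   # unauthenticated requests per hour

# Ceiling division: hourly windows needed for all fetches (4),
# minus the window you get right away = hours spent waiting.
windows=$(( (deps + limit - 1) / limit ))
echo $(( windows - 1 ))   # prints 3
```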

bkhl@social.sdfeu.org on 15 May 11:06 collapse

@UnityDevice @sxan it doesn't apply in that particular case since in Go you'll by default download those modules through proxy.golang.org

UnityDevice@lemmy.zip on 16 May 09:00 collapse

Oh, that’s nice, TIL. But still, there are other projects that do just directly download from GitHub when building, buildroot for example.

blaue_Fledermaus@mstdn.io on 15 May 00:37 next collapse

The numbers actually seem reasonable...

henfredemars@infosec.pub on 15 May 01:22 next collapse

Not at all if you’re a software developer, which is the whole point of the service. Automated requests from their own tools can easily punch through this limit when building a large project even one time.

douglasg14b@lemmy.world on 15 May 07:59 collapse

60 requests

Per hour

How is that reasonable??

You can hit the limits by just browsing GitHub for 15 minutes.

Sunshine@lemmy.ca on 15 May 01:33 next collapse

!codeberg@programming.dev

XM34@feddit.org on 15 May 07:45 collapse

Codeberg has used way stricter rate limiting since pretty much forever. Nice thought, but Codeberg will not solve this problem, like at all.

onlinepersona@programming.dev on 15 May 09:54 collapse

What? I have never seen a rate limiting screen on codeberg. Ever. If I click too much on github I get rate limited. It happens so frequently, I use sourcegraph.com/search when I have to navigate a repository’s code.

Anti Commercial-AI license

atzanteol@sh.itjust.works on 15 May 01:42 next collapse

The enshittification begins (continues?)…

kixik@lemmy.ml on 15 May 03:49 collapse

just now? :)

ArsonButCute@lemmy.dbzer0.com on 15 May 01:55 next collapse

THIS is why I clone all my commonly used Repos to my personal gitea instance.

SmoothLiquidation@lemmy.world on 15 May 03:28 next collapse

I recently switched my instance from gitea to forgejo because everyone said to do it and it was easy to do.

lazynooblet@lazysoci.al on 15 May 06:46 collapse

What were the benefits

emmanuel_car@fedia.io on 15 May 07:07 collapse

Mostly people stopped telling them to do it, I guess 🤷‍♂️

douglasg14b@lemmy.world on 15 May 07:58 collapse

That’s actually kind of an interesting idea.

Is there a reasonable way that I could host my own UI that will keep various repos I care about cloned and always up to date automatically?

ArsonButCute@lemmy.dbzer0.com on 15 May 14:36 collapse

Afaict, you should be able to follow the instructions for migrating a repo, and it will clone it to your instance and track updates. It’s been a minute since I’ve read up on it though.

timewarp@lemmy.world on 15 May 02:04 next collapse

Crazy how many people think this is okay, yet left Reddit cause of their API shenanigans. GitHub is already halfway to requiring signing in to view anything like Twitter (X).

plz1@lemmy.world on 15 May 03:47 next collapse

They make you sign in to use search, for code anyway.

goferking0@lemmy.sdf.org on 15 May 12:46 collapse

Which i hate so much anytime i want to quickly look for something

calcopiritus@lemmy.world on 16 May 11:56 collapse

It’s not the same thing, making API costs unbearable for social media users versus rate-limiting non-logged-in users.

You can still use GitHub without being logged in. You can still use GitHub without almost any limit on a free account.

You cannot even use reddit on a third party app with an account with reddit gold.

kevin____@lemm.ee on 15 May 02:55 next collapse

Good thing git is “federated” by default.

MITM0@lemmy.world on 15 May 09:05 collapse

& then you have Fossil, which is GitHub in a box

plz1@lemmy.world on 15 May 03:50 next collapse

This is specific to the GH REST API I think, not operations like doing a git clone to copy a repo to local machine, etc.

Kissaki@programming.dev on 15 May 05:54 next collapse

These changes will apply to operations like cloning repositories over HTTPS, anonymously interacting with our REST APIs, and downloading files from raw.githubusercontent.com.

tauren@lemm.ee on 15 May 05:58 collapse

These changes will apply to operations like cloning repositories over HTTPS, anonymously interacting with our REST APIs, and downloading files from raw.githubusercontent.com.

irelephant@programming.dev on 15 May 06:18 collapse

downloading files from raw.githubusercontent.com

oh fuck, this is going to break stuff.

cupcakezealot@lemmy.blahaj.zone on 15 May 04:01 next collapse

is authenticated like when you use a private key with git clone? stupid question i know

also this might be terrible if you subscribe to filter lists on raw github in ublock or adguard

chaospatterns@lemmy.world on 15 May 06:15 collapse

is authenticated like when you use a private key with git clone

Yes

also this might be terrible if you subscribe to filter lists on raw github in ublock or adguard

Yes, and that’s exactly why this is actually quite problematic. There are a lot of HTTPS Git pull remotes around, and random software that uses raw.githubusercontent.com to fetch data. All of that is now subject to the 60 req/hr limit, and not all of it will be easy to fix.
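For the git side, one common workaround is a config-level URL rewrite that turns GitHub HTTPS remotes into authenticated SSH on the fly, so existing remotes and submodule URLs keep working without editing each one (this assumes you have an SSH key registered with GitHub):

```shell
# Rewrite any https://github.com/... remote to SSH at fetch time:
git config --global url."git@github.com:".insteadOf "https://github.com/"

# Verify the rewrite is registered:
git config --global url."git@github.com:".insteadOf   # prints https://github.com/
```

Note this only helps git itself; tools that hit raw.githubusercontent.com directly still need a token or a mirror.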

ozoned@piefed.social on 15 May 04:42 next collapse

Wow so surprising, never saw this coming, this is my surprised face. :-l

irelephant@programming.dev on 15 May 06:17 next collapse

It’s always blocked me from searching in Firefox when I’m logged out, for some reason.

mr_satan@lemm.ee on 15 May 06:19 next collapse

That’s just how the cookie crumbles.

onlinepersona@programming.dev on 15 May 10:02 next collapse

I see the “just create an account” and “just login” crowd have joined the discussion. Some people will defend a monopolist no matter what. If github introduced ID checks à la Google or required a Microsoft account to login, they’d just shrug and go “create a Microsoft account then, stop bitching”. They don’t realise they are being boiled and don’t care. Consoomer behaviour.

Anti Commercial-AI license

calcopiritus@lemmy.world on 16 May 11:25 collapse

Or we just realize that GitHub without logging in is a service we are getting for free. And when there’s something free, there’s someone trying to exploit it. Using GitHub while logged in is also free and has none of these limits, while allowing them to much easier block exploiters.

onlinepersona@programming.dev on 16 May 11:40 collapse

I would like to remind you that you are arguing for a monopolist. I’d agree with you if it were for a startup or mid-sized company that had lots of competition and was providing a good product being abused by competitors or users. But Github has a quasi-monopoly, is owned by a monopolist that is part of the reason other websites are being bombarded by requests (aka, they are part of the problem), and you are sitting here arguing that more people should join the monopoly because of an issue they created.

Can you see the flaws in reasoning in your statements?

Anti Commercial-AI license

calcopiritus@lemmy.world on 16 May 13:12 collapse

No. I cannot find the flaws in my reasoning, because you are not attacking my reasoning; you are saying that I am on the side of the bad people, the bad people are bad, and you are opposed to the bad people, therefore you are right.

The world is more than black and white. GitHub rate-limiting non-logged-in users makes sense, and is the expected result in the age of web-scraping LLM training.

Yes, the parent company of GitHub also does web scraping for the purpose of training LLMs. I don’t see what that has to do with defending themselves from other scrapers.

onlinepersona@programming.dev on 16 May 13:31 collapse

Company creates problem. Requires users to change because of created problem. You defend company creating problem.

That’s the logical flaw.

If you see no flaws in defending a monopolist, well, you cannot be helped then.

Anti Commercial-AI license

calcopiritus@lemmy.world on 16 May 19:21 collapse

I don’t think Microsoft invented scraping. Or LLM training.

Also, GitHub doesn’t have an issue with Microsoft scraping its data. They can just directly access whatever data they want. And rate-limiting non-logged-in accounts won’t affect Microsoft’s LLM training at all.

I’m not defending a monopolist because of monopolist actions. First of all, because GitHub doesn’t have any kind of monopoly; there are plenty of git forges. And second of all, how does this make their position on the market stronger? If anything, it makes it weaker.

daniskarma@lemmy.dbzer0.com on 15 May 10:27 next collapse

Open source repositories should rely on p2p. Torrenting repos is the way I think.

Not only for this. At any point m$ could take down your repo if they or their investors don’t like it.

I wonder if something like that already exists, and whether it could work with git?

Kuinox@lemmy.world on 15 May 10:31 next collapse

Torrenting doesn’t deal well with updating files.
And you have another problem: how do you handle bad actors spamming the download?
That’s probably why GitHub does that.

daniskarma@lemmy.dbzer0.com on 15 May 10:35 collapse

That’s true. I didn’t think of that.

IPFS supposedly works fine with updating shares. But I don’t want to get closer to that project, as they have fallen into cryptoscam territory.

I’m currently reading about “radicle”; let’s see what they propose.

I don’t get the bad actors spamming the download. Like downloading too much? Torrent leechers?

EDIT: Just finished my search about radicle. They of course have relations with a cryptoscam. Obviously… ;_; why does this keep happening?

Jakeroxs@sh.itjust.works on 16 May 04:07 collapse

There’s literally nothing about crypto in radicle from my reading; cryptography and cryptocurrency are not synonymous.

Ah because they also have a different project for a crypto payment platform for funding open source development.

Edit again: it seems pretty nifty actually, why do you think it’s a scam? Just because crypto?

samc@feddit.uk on 15 May 10:36 next collapse

The project’s official repo should probably exist in a single location so that there is an authoritative version. At that point p2p is only necessary if traffic for the source code is getting too expensive for the project.

Personally I think the source hut model is closest to the ideal set up for OSS projects. Though I use Codeberg for my personal stuff because I’m cheap and lazy

daniskarma@lemmy.dbzer0.com on 15 May 10:49 collapse

I’m wary of external dependencies. They are cool now, but will they be cool in the future? Will they even exist?

One thing I think p2p excels at is resilience. People are still using eDonkey even though it’s abandoned.

A repo signature should deal with “fake copies”. It’s true we have the problem that the BitTorrent protocol wasn’t designed for updating files, so a different protocol would be needed; I don’t even know how possible/practical that is. It’s true that any big project should probably host their own remote repo and copy it to other platforms as needed. GitHub-only repos were always a dangerous practice.

samc@feddit.uk on 15 May 12:55 next collapse

If you’re able to easily migrate issues etc. to a new instance, then you don’t need to worry about a particular service provider getting shitty. At which point your main concern is temporary outages.

Perhaps this is more of a concern for some projects (e.g. anything that angers Nintendo’s lawyers). But for most, I imagine that the added complexity of distributed p2p hosting would outweigh the upsides.

Not saying it’s a bad idea, in fact I like it a lot, but I can see why it’s not a high priority for most OSS devs

Revan343@lemmy.ca on 15 May 18:07 collapse

It’s true we have the problem that the BitTorrent protocol wasn’t designed for updating files

BitTorrent has updatable torrents (BEP 46, mutable torrents via the DHT)

onlinepersona@programming.dev on 15 May 11:56 next collapse

radicle.xyz

daniskarma@lemmy.dbzer0.com on 15 May 11:59 collapse

I’ve been reading about it. But at some point I found that the parent organization runs a crypto scam. Supposedly it’s not embedded into the protocol, but they also said that the token is used to give rewards within the protocol. That just made me wary of them.

Though the protocol did seem interesting. It’s MIT licensed, I think, so I suppose it could just be forked into something crypto-free.

onlinepersona@programming.dev on 15 May 14:55 collapse

There’s nothing crypto in the radicle protocol. What I think you’re referring to are “drips” which uses crypto to fund opensource development (I know how terrible). It’s its own protocol built on top of ethereum and is not built into the radicle protocol.

This comes up every time someone mentions radicle and I think it happens because there’s a RAD crypto token and a radicle protocol. Beyond the similar names, it’s like mistaking bees for wasps because they look similar and not bothering to have a closer look.

Drips are funding the development of gitoxide, BTW, which is a Rust reimplementation of git. I wouldn’t start getting suspicious of gitoxide sneaking in a crypto protocol just because it’s funded by crypto. If we attacked everything funded by the things we consider evil, well everything opensource made by GAFAM would have to go: modern video streaming (HLS by Apple), Android (bought by Google), LSPs (popularised and developed by Microsoft), OBS (sponsored by Google through YouTube and by Amazon through Twitch), and much much more.

Anti Commercial-AI license

daniskarma@lemmy.dbzer0.com on 15 May 14:59 collapse

The thing is that the purpose of such a system is to run away from enshittification.

Being so crypto-adjacent is like an enshittification speedrun.

If I’m going to stay on a platform that just cares about the money, I might as well stay on corpo platforms. I’m not going to the trouble of changing platforms and learning new systems just to keep being used so others can get rich.

Git itself doesn’t have crypto around it. This shouldn’t have either.

And this is not even against crypto as a concept, which is fine by me. It’s against using crypto as a scam to get a quick buck out of people who don’t know better.

onlinepersona@programming.dev on 15 May 16:23 collapse

If I’m going to stay on a platform that just cares about the money

Where are you getting this information from? How is radicle just caring about money?

I’m not going to the trouble of changing platforms and learning new systems just to keep being used so others can get rich.

Who is getting rich and how?

Anti Commercial-AI license

daniskarma@lemmy.dbzer0.com on 15 May 16:52 collapse

The answer to both questions is the crypto scheme they have created. There is no logical explanation for it. We have seen it happen countless times before.

They could ask for crypto donations and that would be totally fine. But they are building a crypto scheme, and crypto schemes are built as pyramid schemes to get money out of vulnerable people. Anyone who makes such a thing is not trustworthy.

onlinepersona@programming.dev on 15 May 17:35 collapse

Who is building a cryptoscheme? Radicle developers aren’t building a cryptoscheme. Again, radicle is not crypto, it’s a decentralised git forge.

Anti Commercial-AI license

daniskarma@lemmy.dbzer0.com on 15 May 19:04 collapse

The same devs are behind the RAD token, which they themselves say will be used with the protocol. You cannot disconnect the devs of RAD from the devs of radicle, because they are the same people. It’s like saying that YouTube has nothing to do with Google.

onlinepersona@programming.dev on 15 May 20:52 collapse

Where did they say that RAD the token will be used with radicle the git forge? Please provide a link.

Anti Commercial-AI license

daniskarma@lemmy.dbzer0.com on 15 May 21:04 collapse

gemini.com/…/what-is-radicle-crypto-github-altern…

Do you think they made a crypto scam with almost the same name just for funsies?

Better by their own words: docs.radworks.org/community/rad-token

$RAD is the native token of the Radworks Network, used as the primary means to coordinate all actors, govern the treasury, and (later this year will) reward infrastructure providers on top of the Radicle network.

onlinepersona@programming.dev on 16 May 06:31 collapse

The Gemini.com article looks like AI slop to me, honestly.

In lieu of traditional client-server architecture, Radicle Link uses a Directed Acyclic Graph (DAG) as the core of its P2P network, a distributed ledger technology similar to blockchain that excels in speed and scalability.

DAGs are a distributed ledger? Wat?

Also if you actually looked at the code of radicle, you wouldn’t find rad tokens, erc-20, or whatever else. If you further looked at the protocols you’d see that they aren’t using a blockchain. Repository ownership is not handled by smart contracts either - it’s all public key cryptography, which (again) is not crypto in the sense you’re talking about.

To be fair, the article is old and describing radicle version 2. You can find the code here, but I can’t find ERC tokens or anything like that in there, which further makes me think the authors of the article are very confused, AI, or misrepresenting the project on purpose. Of course, it’s possible that all references to crypto were removed from the archive, but it would be good to provide a link to that if you found it.

$RAD is the native token of the Radworks Network, used as the primary means to coordinate all actors, govern the treasury, and (later this year will) reward infrastructure providers on top of the Radicle network.

This I didn’t know of. But I’m curious how that will be done. It is not proof of crypto being within the radicle protocol or codebase (because it isn’t, I looked; maybe I missed it, but I’d like proof if so). It might be put in there in the future, but I’m pretty sure they know doing that would piss people off.

My guess is that they’ll do it like IPFS, which I don’t think has crypto in the protocol but has Filecoin on top to reward people who pin things in IPFS. But IPFS users can completely ignore Filecoin and aren’t required to use it.

Anti Commercial-AI license

daniskarma@lemmy.dbzer0.com on 16 May 07:30 collapse

As I said, it’s not that crypto is inside the protocol. It’s that the parent company (which is just the same people) made a crypto token to be used with the protocol, on top of it. Which takes trust away from me. If they want donations, ask for donations. If they want to provide a paid service, then offer a paid service.

But having a crypto token they want to move around feels dishonest to me. People will be pumping the crypto thinking they are making an investment, thinking they are going to earn money or participate in some sort of circular economy, but they will lose it all. And the owners will cash out to fiat as soon as they can. I have seen it happen countless times.

If they do that from the beginning, what will they be doing in the future if the project takes off? I’d better not find out.

onlinepersona@programming.dev on 16 May 08:37 collapse

OK I understand your concerns better. Thank you for explaining.

I am less concerned and don’t have such a negative relationship with crypto. As long as it’s not the selling point of something and decoupled from the actual project or product, that’s fine to me. That others don’t feel the same way is understandable.

For me, radicle is the fastest way to get off of GitHub. All my projects are now there and anybody can contribute without signing up to yet another website, i.e. they don’t need a login for each individual Forgejo or GitLab instance. One radicle identity is all you need to contribute to a radicle project on any seed node.

If (when?) forgejo finally gets federation, I’d be more open to using it, but at the moment, it barely provides an advantage over radicle.

Anti Commercial-AI license

thenextguy@lemmy.world on 15 May 13:33 next collapse

Git is p2p and distributed from day 1. Github is just a convenient website. If Microsoft takes down your repo, just upload to another system. Nothing but convenience will be lost.
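(Since every clone already carries the full history, moving wholesale really is just a couple of commands. A sketch; backup.example.com is a placeholder for whatever host you migrate to.)

```shell
# Mirror-clone copies every branch, tag, and ref, not just the checkout.
git clone --mirror https://github.com/you/project.git
cd project.git

# Point origin at the new host ("backup.example.com" is a placeholder)
# and push everything there in one go.
git remote set-url origin ssh://git@backup.example.com/you/project.git
git push --mirror
```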

witten@lemmy.world on 15 May 16:00 next collapse

Not entirely true. You lose tickets and PRs in that scenario.

QuazarOmega@lemy.lol on 15 May 19:02 collapse

I’ve heard git-bug mentioned a few times for decentralised issue tracking; never tried it, but the idea is interesting

witten@lemmy.world on 16 May 02:19 collapse

Yeah, pretty neat!

ethancedwards8@programming.dev on 16 May 19:20 collapse

Look into radicle.xyz

Natanael@infosec.pub on 15 May 18:20 collapse
brachiosaurus@mander.xyz on 15 May 10:31 next collapse

I have a question: why do the Lemmy devs keep using Microsoft’s GitHub?

napkin2020@sh.itjust.works on 15 May 11:44 collapse

Yeah, shoulda used gitflic.ru

midori_matcha@lemmy.world on 15 May 15:29 next collapse

Github is owned by Microsoft, so don’t worry, it’s going to get worse

Lv_InSaNe_vL@lemmy.world on 15 May 15:39 next collapse

I honestly don’t really see the problem here. This seems to mostly be targeting scrapers.

For unauthenticated users you are limited to public data only and 60 requests per hour, or 30k if you’re using Git LFS. And for authenticated users it’s 60k/hr.

What could you possibly be doing besides scraping that would hit those limits?

chaospatterns@lemmy.world on 15 May 17:27 next collapse

You might be behind a shared IP with NAT or CG-NAT that shares that limit with others, or might be fetching files from raw.githubusercontent.com as part of an update system that doesn’t have access to browser credentials, or Git cloning over https:// to avoid having to unlock your SSH key every time, or cloning a Git repo with submodules that separately issue requests. An hour is a long time. Imagine if you let uBlock Origin update filter lists, then you git clone something with a few submodules, and so does your coworker, and now you’re blocked for an entire hour.
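(If you want to see whether someone sharing your IP has already burned the quota, GitHub’s documented /rate_limit endpoint reports what’s left and doesn’t itself count against the limit. A minimal Python sketch; the field names follow GitHub’s documented response shape.)

```python
# Sketch: check how much GitHub API quota this IP has left.
# Calling /rate_limit does not count against the quota itself.
import json
import urllib.request

def summarize(payload):
    """Render the 'core' bucket of a /rate_limit response."""
    core = payload["resources"]["core"]
    return f'{core["remaining"]}/{core["limit"]} requests left'

def check(token=None):
    req = urllib.request.Request("https://api.github.com/rate_limit")
    if token:  # a personal access token raises the limit substantially
        req.add_header("Authorization", f"Bearer {token}")
    with urllib.request.urlopen(req) as resp:
        return summarize(json.load(resp))
```

Run `check()` unauthenticated and you’ll see the shared anonymous bucket; pass a token and the numbers jump to your personal quota.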

Disregard3145@lemmy.world on 15 May 18:36 next collapse

I hit those many times when signed out just scrolling through the code. The front end must be sending off tonnes of background requests

Lv_InSaNe_vL@lemmy.world on 15 May 18:45 collapse

This doesn’t include any requests from the website itself

MangoPenguin@lemmy.blahaj.zone on 16 May 03:17 collapse

60 requests per hour per IP could easily be hit from say, uBlock origin updating filter lists in a household with 5-10 devices.

DoucheBagMcSwag@lemmy.dbzer0.com on 15 May 16:30 next collapse

Is this going to fuck over Obtainium?

varnia@lemm.ee on 16 May 06:52 next collapse

Good thing I moved all my repos from git[lab|hub] to Codeberg recently.

stinky@redlemmy.com on 16 May 14:59 collapse

did your project manager or client ask for you to move there?

PurpleStephyr@lemmy.blahaj.zone on 16 May 13:56 next collapse

RIP yocto builds

bitwolf@sh.itjust.works on 16 May 15:32 collapse

Maybe charge OpenAI for scrapes instead of screwing over your actual customers.