noirscape 1 days ago [-]
I don't dislike Codeberg inherently, but it's not a "true" GitHub replacement. It can handle a good chunk of GitHub repositories (namely those of well-established FOSS projects that want everything a proper capital-P Project has), but if you're just looking for a generic place to put code projects that aren't necessarily intended for public release and support (i.e. random automation scripts, scraps of concepts that never really got off the ground, things not super cleaned up), they're not really for that - private repositories are discouraged according to their FAQ and are very limited (up to 100 MB).
They also don't want to host your homepage, so if GitHub Pages is why you used GitHub, they are not a replacement.
Unfortunately I don't think there's really an answer to that conundrum that doesn't involve just spinning up your own git server and accepting all the operational overhead that comes with it. At least Forgejo (software behind Codeberg) is FOSS, so you can do that and it should cover most of what you need (and while you're in the realm of having a server, a Pages-esque replacement is trivial since you're configuring a webserver anyway.) Maybe Gitlab.com, although I am admittedly unfamiliar with how Gitlab's "main" instance has changed over the years wrt features.
> If you do not contribute to free/libre software (or if it is limited to your personal homepage), and we feel like you only abuse Codeberg for storing your commercial projects or media backups, we might get unhappy about that.
Emphasis mine. This isn't about if it's technically possible (it certainly is), it's whether or not it's allowed by their platform policies.
Their page publishing feature seems more like it's meant for projects and organizations rather than individual people. The way it's described here indicates that using them to host your own blog/portfolio/what have you is considered to be abusing their services.
shimman 1 days ago [-]
Seems fair to me, they're a nonprofit that exists in our lived reality and not an abusive monopolist that can literally throw a billion dollars to subsidize loss leaders.
All it shows the world is why there needs to be a VAT like tax against US digital services to help drive a public option for developers.
There's no reason why the people can't make our own solutions rather than be confined to abusive private US tech platforms.
germandiago 1 days ago [-]
Adding taxes to things does not help anyone and goes against free choice.
A better alternative would be to create the incentives so that companies like these can be born in Europe.
era-epoch 1 days ago [-]
Companies like Microsoft should not be given "incentives to exist" anywhere (at least as they exist currently). They are corrosive to the public good.
cromka 22 hours ago [-]
This doesn't work when a market is run by oligopolies; you have to regulate to restore some sort of normalcy and competition.
shimman 9 hours ago [-]
Disagree; the only alternative is to let the people decide. I don't trust a dozen men who already hold deeply undemocratic beliefs to dictate the direction of tech for society.
You are against democracy, I am not. Democracy has led to some of the best advances of civilization, all oligarchies have done is introduce mass poverty, mass misery, and mass death.
At least with democracy we went to the moon for mankind, not shareholders.
johnisgood 1 days ago [-]
Reading what you quoted: no, it is not, as long as you contribute to free software or have projects that are open source - not just your personal homepage. If all you have is a personal homepage and nothing else that is open source, then they have a problem.
My 2 cents.
noirscape 1 days ago [-]
Which makes it not really a suitable replacement for GitHub, which is my entire point.
Keep in mind, I'm not saying Codeberg is bad, but its terms of use are pretty clear: they only really want FOSS, and anyone who has something other than FOSS had better look elsewhere. GitHub allowed you to put up basically anything that's "yours", and the license wasn't really their concern - that isn't the case with Codeberg. It's not about price either; it'd be fine if the offer were "either give us $5 for the privilege of private repositories or only publish and contribute public FOSS code" - I'm fine paying cash for that if need be.
One of the big draws of GitHub (and what got me to properly learn git) back in the day with GitHub Pages in particular was "I can write an HTML page, do a git push and anyone can see it". Then you throw on top an SSG (GitHub had out of the box support for Jekyll, but back then you could rig Travis CI up for other page generators if you knew what you were doing), and with a bit of technical knowledge, anyone could host a blog without the full on server stack. Codeberg cannot provide that sort of experience with their current terms of service.
Even sourcehut has, from what I can tell, a more lenient approach to what they provide (and the only reason why I wouldn't recommend sourcehut as a GitHub replacement is because git-by-email isn't really workable for most people anymore). They encourage FOSS licensing, but from what I can tell don't force it in their platform policies. (The only thing they openly ban is cryptocurrency related projects, which seems fair because cryptocurrency is pretty much always associated with platform abuse.)
tjoff 1 days ago [-]
(SSG - static site generator)
I mean, it is arguably much easier to just write the HTML page and upload it with FTP and everyone can see it. I never understood why github became a popular place to host your site in the first place.
noirscape 1 days ago [-]
> I never understood why github became a popular place to host your site in the first place.
Easy: it was free, it was accessible to people that couldn't spend money for a hosting provider (read: high schoolers) and didn't impose arbitrary restrictions on what you were hosting.
Back then, your options as a high school student were basically to reskin a closed-off platform as much as you could (Tumblr could do that, but GitHub Pages also launched in the period when platforms were cracking down on any user customization beyond "what is my avatar"), or to accept that the site you published your stuff on could disappear the moment the sketchy hosting provider giving you a small amount of storage decided your bandwidth costs justified upselling you to the premium plan.
GitHub didn't impose those restrictions, in exchange for being a bit less interactive when it came to publishing things (so no comment section without using Disqus or something like it, and chances are you didn't need the comments anyway, so win-win). That's why it got a lot more popular than just using an FTP server.
kbolino 1 days ago [-]
There are multiple reasons why FTP by itself became obsolete. Some of them I can think of off the top of my head:
1) Passive mode. What is it and why do I need it? Well, you see, back in the old days, .... It took way too long for this critical "option" to become well supported and used by default.
2) Text mode. No, I don't want you to corrupt some of my files based on half-baked heuristics about what is and isn't a text file, and it doesn't make any sense to rewrite line endings anymore anyway.
3) Transport security. FTPS should have become the standard decades ago, but it still isn't to this day. If you want to actually transfer files using an FTP-like interface today, you use SFTP, which is a totally different protocol built on SSH.
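Point 2 is easy to demonstrate without an FTP server at all. Here's a local simulation of the byte rewrite an ASCII-mode transfer performs on a Unix-bound download (no real FTP involved; the filenames are made up):

```shell
# A "binary" file that happens to contain CR (0x0d) bytes:
printf 'PNG\r\nDATA\rMORE\r\n' > original.bin

# tr deleting every CR is a crude stand-in for the protocol's
# line-ending rewrite on a Unix-bound ASCII-mode transfer:
tr -d '\r' < original.bin > transferred.bin

# The received file no longer matches what was sent:
cmp -s original.bin transferred.bin || echo "corrupted in transit"
wc -c < original.bin      # 16 bytes sent
wc -c < transferred.bin   # 13 bytes received
```

Any file where the 0x0d bytes are data rather than line endings - images, archives, executables - comes out the other side silently damaged.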
rkagerer 18 hours ago [-]
Why would you say FTP is obsolete? For what it's worth, I still use it (for bulk file transfer).
joecool1029 17 hours ago [-]
Chrome and Firefox dropped support for it 5 or so years ago; it has had a lot of security issues over the years, was annoying over NAT, and there are better options for secure bulk transfers (sftp, rsync, etc.)
rkagerer 17 hours ago [-]
I see, I assumed by ftp you also meant sftp.
johnmaguire 1 days ago [-]
Because it doesn't require you to run an HTTP server, FTP server, or install an FTP client.
tjoff 1 days ago [-]
Finding an HTTP+FTP server was easier than finding github. Your OS probably has an FTP client installed already, but finding another one is easier than finding git, and most definitely easier than learning git.
And if you already knew how to write/make HTML you'd for sure already know all of that too.
johnmaguire 1 days ago [-]
This is definitely a matter of perspective. I have had a Github account since 2010, and git comes installed on Linux and macOS.
I don't always have a server available to host an HTTP+FTP server on. Or want to pay for one, or spend time setting one up. I can trust that Github Pages will have reasonable uptime, and I won't have to monitor it at all.
> And if you already knew how to write/make HTML you'd for sure already know all of that too.
This seems unnecessarily aggressive, and I don't really understand where it's coming from.
BTW, you can absolutely host plain HTML with Github Pages. No SSG required.
lazyasciiart 1 days ago [-]
> And if you already knew how to write/make HTML you'd for sure already know all of that too.
That's a completely false statement. My kid took very basic programming classes in school which covered HTML so they could build webpages, which is a fantastic instant-results method. Hooray, now the class is finished, he wants to put it on the web. Just like millions of other kids who couldn't even spell FTP.
lmm 22 hours ago [-]
> Finding an HTTP+FTP server was easier than finding github.
No it wasn't. Seriously, where?
Diti 13 hours ago [-]
Didn’t your ISP provide you with free FTP storage? The French ones did, at least.
lmm 12 hours ago [-]
Maybe decades ago. My current one doesn't.
skydhash 20 hours ago [-]
There were a lot of sites that provided some cPanel-like option as long as you were OK with yourcoolname.weirdhostingname.com. I believe they all came with a file browser and the ever-present public_html folder.
shunia_huang 18 hours ago [-]
It'd be nice to mention some big names here that are capable of:
a) git pull & push for updates
b) good enough CDN distribution; sometimes an interactive example/project page loads tons of files
c) good enough security promises for the entire platform/infra
d) good enough uptime; we don't need a 99.9999% SLA, but it had better not be down often
e) a generous free tier
f) great DX & UX; this one is small, but small headaches add up quickly
lmm 20 hours ago [-]
There was geocities (now gone) and a couple of *.tk domains that would inject their ads all over your page. Neither makes a great substitute for GitHub pages these days.
kbolino 1 days ago [-]
I touched on the issues with FTP itself in another comment, but who can forget the issues with HTTP+FTP, like: modes (644 or 755? wait, what is a umask?), .htaccess, MIME mappings, "why isn't index.html working?", etc. Every place had a different httpd config and a different person you had to email to (hopefully) get it fixed.
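For anyone who never had to learn it: the 644/755 dance exists because the httpd ran as a different user, so your files needed the world-readable bit to be served at all. A quick sketch (made-up filenames; `stat -c` is the GNU coreutils form):

```shell
umask 022                 # typical host default: new files get 644
touch index.html
stat -c '%a' index.html   # -> 644, world-readable, httpd can serve it

umask 077                 # a locked-down umask instead
touch secret.html
stat -c '%a' secret.html  # -> 600, httpd now gets permission denied
```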
enraged_camel 1 days ago [-]
That FAQ snippet is insane to me. Maybe it's a cultural thing but I'd never do business with a company that has implicit threats in their ToS based on something so completely arbitrary.
0x3f 1 days ago [-]
The worst part is really the unclear procedure. If they set out terms that say they'll give me 4 weeks to migrate projects they don't like off the platform, with n email reminders in between, then that's not ideal but fine. As it is, I'd be worried I'll wake up to data loss if they get 'unhappy'. I have the same problem with sourcehut, actually, with their content policy.
semiquaver 1 days ago [-]
Nonprofit, not a company. as far as I can tell they don’t accept payment for anything so they don’t want your “business”.
jeremyjh 24 hours ago [-]
Which is why they are not a Github replacement for everyone.
shimman 9 hours ago [-]
They're a perfectly fine replacement if you care about FOSS and OSS, unlike GitHub + MSFT.
rkagerer 18 hours ago [-]
Well, it's kind of describing the reality that exists at other companies today. Most ToSes have clauses letting them kick you off for not using the service as intended, solely at their discretion. At least these guys are honest and upfront about it. I do agree, though, that some more guidelines around their policy would be nice.
gzread 19 hours ago [-]
> and we feel like you only abuse Codeberg for storing your commercial projects or media backups
Sounds like they're cool with a little personal website.
tbayramov 1 days ago [-]
(Shameless plug)
Hey, I’m building Monohub as a GitHub alternative, and having private repositories is perhaps its key feature - it started as a place for me to host my own random stuff. Monohub [dot] dev is the URL. It’s early in development, so it’s quite rough around the edges. It has PR support, though.
Hosted in EU, incorporated in EU.
Would be happy if you tried it out — maybe it’s something for you.
Are you using an existing forge package (like eg Forgejo which codeberg is built on) or something custom?
tbayramov 1 days ago [-]
Custom-built on top of libgit2.
NewJazz 1 days ago [-]
Not trying to be dismissive/snarky... But why?
tbayramov 17 hours ago [-]
I started developing it as a slim wrapper around Git to support my own needs. At the same time, it is essential to have rich features like pull requests/code review, so I started focusing on designing a tool that strikes an appropriate balance between being minimalistic and functional. One thing that I focus on is allowing users to disable any feature they don't need.
WhyNotHugo 1 days ago [-]
Doesn’t .dev belong to Google? An odd choice of provider for EU-focused hosting.
KomoD 1 days ago [-]
And the site also uses Cloudflare (for domain registrar, DNS and CDN):
ipinfo monohub.dev
Core
- IP 188.114.96.1
- Anycast true
- Hostname
- City San Francisco
- Region California
- Country United States (US)
- Currency USD ($)
- Location 37.7621,-122.3971
- Organization AS13335 Cloudflare, Inc.
- Postal 94107
- Timezone America/Los_Angeles
Auth is hosted by Kinde (an Australian company, uses AWS)
It belonged to everyone until Google became the TLD operator.
(Yes, I'm still salty google broke our localhost .dev sites.)
(Yes, I know it was never a reserved TLD.)
NewJazz 1 days ago [-]
> Unfortunately I don't think there's really an answer to that conundrum that doesn't involve just spinning up your own git server and accepting all the operational overhead that comes with it.
Hmm all that operational overhead... Of an ssh server? If you literally just want a place to push some code, then that really isn't that hard.
jayd16 1 days ago [-]
Lots and lots of programmers have very little understanding and especially operation knowledge of how to host a public service. You can be an extreme graphics programmer and not know the web stack at all.
And no, it's not that hard once you learn. Except now it's a never-ending chore when it used to be an appliance. Instead of a car, you have a project car.
zahlman 1 days ago [-]
> Lots and lots of programmers have very little understanding and especially operation knowledge of how to host a public service. You can be an extreme graphics programmer and not know the web stack at all.
Can confirm.
Also, not everyone who wants to share content publicly has a domain name with which to do so, or the kind of Internet connection that allows running a server. If you include "hosting" by using a hosting provider... it's perfectly possible (raises hand) to not even have any experience with that after decades of writing code and being on the Internet. (Unless you count things like, well, GitHub and its services, anyway.)
mememememememo 1 days ago [-]
On the other hand you probably don't need to go full k8s and datadog on it. Just host it. Use a PaaS so you don't need to do Linux admin.
NewJazz 1 days ago [-]
I think both of you are misunderstanding what I proposed. You just need a single VM with an ssh server. Literally no web service needed, if all you want to do is host some code remotely.
jayd16 22 hours ago [-]
I didn't misunderstand. Sshd is a web service. Most folks don't already know how, and don't want, to set up a machine that is always on, that will restart on power loss, that will have a static IP or dynDNS, with a domain name and proper routing and open ports and certs and enough bandwidth - and that's before you even worry about actual security rather than just what is needed to work... It's actually a big annoyance if you don't do it all the time.
skydhash 20 hours ago [-]
ssh isn't a web service (some would argue smtp and ftp aren't either, as they predate the web).
And I believe GP was saying the only thing you need is:
ssh user@remotehost git init --bare repo.git
And then you can add the remote to your local repo with
git remote add origin user@remotehost:repo.git
Now all you need to do is
git push origin branch_name
Replace origin with another identifier if it's already taken.
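That whole flow can be rehearsed entirely locally, with a directory standing in for remotehost (the /tmp paths are made up for the sketch):

```shell
# Stand-in for: ssh user@remotehost git init --bare repo.git
git init --bare /tmp/repo.git

# A throwaway local repo with one commit to push:
mkdir -p /tmp/work && cd /tmp/work
git init
git -c user.email=me@example.com -c user.name=me \
    commit --allow-empty -m "first"

# Same shape as user@remotehost:repo.git, just a local path:
git remote add origin /tmp/repo.git
git push origin HEAD

# The bare repo now holds the pushed branch:
git --git-dir=/tmp/repo.git branch
```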
mememememememo 15 hours ago [-]
The rest of the owl: go to a provider, set up the VM (20 questions), log into root. SSH for login. Set up firewalls. Create a non-root user. useradd or adduser? Depends if you want a home dir, I guess. Debug why you can't ssh in. Finally get in. sudo apt update. sudo apt install git (or is it named something else?). Install fail2ban. Install a firewall.
then do above.
then troubleshoot.
set vm backup policy.
save myriad passwords and secret to bitwarden.
get ubuntu to stay up to date.
NewJazz 8 hours ago [-]
I could write a much more complicated list of steps for github.
mememememememo 23 hours ago [-]
VM and ssh. Needs linux admin exp. Security updates. Understand how to securely connect from an IP without opening 22 on 0.0.0.0/0
NewJazz 8 hours ago [-]
Opening ssh to the internet is fine if you are using key based auth, which is the default on many VM setups.
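For reference, key-only auth is a few lines in /etc/ssh/sshd_config (a minimal sketch; exact option names and defaults vary a bit across OpenSSH versions and distro/cloud images):

```
# /etc/ssh/sshd_config - refuse password logins entirely
PasswordAuthentication no
KbdInteractiveAuthentication no
PubkeyAuthentication yes
PermitRootLogin prohibit-password
```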
mejutoco 1 days ago [-]
In case anybody is interested, having a bare git repo on a server is as easy as:
git init --bare repo.git
P.S. I know it does not have the same features as github
SpaceNoodled 1 days ago [-]
It has all the same working features as github
stephenr 1 days ago [-]
If it's your ssh server and it's single user you don't need to use the "git@" part at all.
Just store the repo and access it with your account.
The whole git@ thing is because most "forge" software is built around a single dedicated user doing everything, rather than taking advantage of the OS's users, permissions, and ACL system.
For a single user it's pointless. For anyone who knows how to setup filesystem permissions it's not necessary.
mejutoco 1 days ago [-]
I prefer to be explicit about which user is connecting to ssh.
kbolino 1 days ago [-]
There isn't much advantage to be taken from OS users and perms anyway, at least as far as git is concerned. When using a shared-filesystem repository over SSH (or NFS, etc.), the actually usable access levels are: full (including the ability to rewrite history, forge commits from other users, and corrupt/erase the repo), read-only, and none.
skydhash 20 hours ago [-]
Git was built to be decentralized, with everyone having their own copy. In an organization, someone trusted holds the keys to the canonical version. If you need to discuss and review patches, you use a communication medium (email, forums, IRC, a shared folder, ...).
lazyasciiart 1 days ago [-]
But they don't literally just want a place to push some code. The problem statement included "if GitHub Pages is why you used GitHub".
NewJazz 1 days ago [-]
And before that it said...
> but if you're just looking for a generic place to put your code projects that aren't necessarily intended for public release and support (ie. random automation scripts, scraps of concepts that never really got off the ground, things not super cleaned up), they're not really for that - private repositories are discouraged according to their FAQ and are very limited (up to 100mb).
Do most VPS providers enable password-based ssh access on first boot? That video doesn't seem very relatable.
eddythompson80 22 hours ago [-]
Exactly. It’s much simpler
overfeed 1 days ago [-]
> ...that really isn't that hard.
Until the AI scrapers[1] come for you at 5k requests per second and you're doing operations in hard-mode.
1. Most forges have HTTP pages for discoverability. I suppose one could hypothetically set up an ssh-only forge and statically generate an HTML site periodically, but this is already advanced ops for the average GitHub user.
NewJazz 1 days ago [-]
I wasn't proposing a full on forge, just a VM with a (key auth only) ssh server to push code to/from.
gzread 19 hours ago [-]
This isn't a real thing, and if it ever becomes a thing you can sue them for DDoS and send Sam Altman to jail. AI scraping is in the realm of 1-5 requests per second, not 5000.
reissbaker 1 days ago [-]
FWIW, Pierre's "Code Storage" project [1] seems like it simplifies a lot of the operational overhead of running git servers, if what you want is "an API for git push". Not affiliated with the company (and I haven't tried it myself, so I can't vouch for how well it works), I just think it's a neat idea.
I think "Code Storage" (which definitely needs a more unique name) is less an API for git push (surely git push is that API?) and more an API for git init. It seems to be Git as infrastructure rather than Git as a product - i.e. if you're using it for a single repo it's probably not a good fit; it's for products that themselves provide git repos.
singpolyma3 22 hours ago [-]
"spinning up your own git server" isn't really a thing. Any server with ssh and git installed is a git server.
shmerl 1 days ago [-]
GitLab is probably a better option for that.
NewJazz 1 days ago [-]
Yeah, but oh boy is a private GitLab server complicated. The Omnibus installation helps manage that, but if you outgrow it you're in for a complicated time.
Also, GitLab has CVEs like every other week... You're going to be on that upgrade train unless you keep access locked down (no internet access!) and accept the admittedly lower risk of a vulnerable GitLab server on your LAN/VPN network.
Even if gitlab is updated fully, you're fighting bot crawlers 24/7.
sneak 1 days ago [-]
I love Gitea and self-hosting it has been effortless, even through upgrades.
shlewis 13 hours ago [-]
Forgejo, even. Both awesome, truly capable Github alternatives.
AlienRobot 1 days ago [-]
To me that sounds like Github does too many things, not that Codeberg does too few.
BeetleB 1 days ago [-]
Indeed. A code repository alternative should definitely not require an equivalent to Github Pages.
pyth0 1 days ago [-]
To me, and devs at large (given their market share), that sounds like convenience.
Rapzid 1 days ago [-]
> Moving from GitHub to Codeberg, for lazy people
Step 1.) Stay on GitHub
I think the internet has "GitHub Derangement Syndrome" right now. It's an outlet for people's frustration.
The current trend reminds me a lot of the couple years we had where Game Developers were that outlet. They needed to "Wake up" and not "Go woke, go broke". An incredible amount of online discourse around gaming was hijacked by toxic negativity.
I'm sure every individual has their really good logical reasons, but zooming out I think there is definitely a similar social pathology at play.
bsder 1 days ago [-]
> I think the internet has "GitHub Derangement Syndrome" right now. It's an outlet for people's frustration.
I would argue that the open source people aren't the only ones paying attention right now.
If you are hosting proprietary code on Github, it has become clear that Microsoft is going to feed that into their AI training set. If you don't want that, you don't have a choice but to leave Github.
beanjuiceII 1 days ago [-]
you just disable the setting
tcmart14 19 hours ago [-]
By the same company that admits disabling telemetry does not in fact disable telemetry and refuses to fix it.
schubidubiduba 1 days ago [-]
...and trust naively that that does anything
righthand 22 hours ago [-]
Don't forget that GitHub also hosts projects for DHS and ICE; people have a right to be angry and leave. The "one place for code" fantasy is over.
gzread 19 hours ago [-]
Private repositories can be hosted anywhere you have SSH access. Write the URL as username@host:/path
peytongreen_dev 21 hours ago [-]
[dead]
ronsor 1 days ago [-]
The truth is that I publish OSS projects on GitHub because that's where the community is, and the issues/pull requests/discussions are a bonus.
If I just want to host my code, I can self host or use an SSH/SFTP server as a git remote, and that's usually what I do.
embedding-shape 1 days ago [-]
> I publish OSS projects on GitHub because that's where the community is
And so we go, forever in circles, until enough of us move to other platforms regardless of where the existing community is. Just like how GitHub found its community in the early days, when most people (afaik) were using SourceForge, if anything.
"The community" will always remain on GitHub, if everyone just upload code to where "the community" already is. If enough of us stop using GitHub by default, and instead use something else, eventually "the community" will be there too, but it is somewhat of a chicken-and-egg problem, I admit.
I work around this myself by dropping the whole idea that I'm writing software for others; I only write it for myself, so if people want it, they can go to my personal Gitea instance and grab it. I couldn't care less about stars and "publicity" or whatever people care about nowadays. But I'm also lucky enough to already have a network; it might require others to build their network on GitHub first and then do something similar, and it'll all work out in the end.
ronsor 1 days ago [-]
> "The community" will always remain on GitHub, if everyone just upload code to where "the community" already is. If enough of us stop using GitHub by default, and instead use something else, eventually "the community" will be there too, but it is somewhat of a chicken-and-egg problem, I admit.
SourceForge was abandoned due to UX issues and the adware debacle; at the same time, GitHub started making changes which made it more viable to use the platform to distribute binary releases.
The deficiencies of GitHub are not critical enough for me to care, and if it ever gets that bad, pushing somewhere else and putting a few "WE HAVE MOVED" links isn't a big deal.
And "the community" isn't moving to Codeberg because Codeberg can't support "the community" without a massive scale up.
mech422 1 days ago [-]
"SourceForge was abandoned due to UX issues and the adware debacle"
I'd say SourceForge was abandoned due to VA Linux going under. I remember the pain/panic as large numbers of OSS projects suddenly scrambled to find alternatives to SF. I actually started a GitHub subscription just to help ensure they had the money to stay in business, so we didn't have to go through that again.
zerkten 1 days ago [-]
>> And "the community" isn't moving to Codeberg because Codeberg can't support "the community" without a massive scale up.
People have a superficial knowledge of the space (I think this extends beyond Codeberg) but feel strongly that they need to advocate for something. Codeberg themselves seem to have opinions about what they want to do, but people suggest they can do more simply because it gives them an outlet.
The constraints that Codeberg set seem to, on the surface at least, ensure they can scale based on their needs and protect them from external threats. Hosting random sites comes with a range of liabilities they probably understand and want to avoid right now. There are EU regulations which can be challenging to handle.
WhyNotHugo 1 days ago [-]
> that's where the community is
The part of the FOSS community that embraces proprietary dependencies is there, but there's a lot of the community outside of it.
Fortunately, GitHub is pushing hard for folks to want to move away.
ncruces 1 days ago [-]
GitHub also generously gives me a bunch of free CI, in exchange for whatever they benefit from me being there.
It's worth $50 just this month, according to them, but I don't see anyone else offering the mac runners that account for most of it.
For all the complaints, I test my packages that actually need it across dozens of architecture and OS combinations with a mix of runners, nested virtualization and qemu binfmt, all on their free platform.
AbanoubRodolf 20 hours ago [-]
[dead]
JuniperMesos 1 days ago [-]
In particular a number of other projects assume that you have a GitHub account. https://github.com/rust-lang/crates.io/issues/326 has been open for literally a decade without any meaningful work. If you want to publish a Lean software packages on Reservoir, the official Lean package registry, their requirements (https://reservoir.lean-lang.org/inclusion-criteria) not only specify a GitHub project specifically, but having at least two stars on GitHub as a "basic quality filter". Microsoft is a big funder of Lean and I can't help but think this is a deliberate restriction to increase lock-in on a Microsoft-owned platform.
Aldipower 13 hours ago [-]
Which community? Organic traffic to your GitHub comes exclusively from external references and links. There's no reason the same wouldn't work with Codeberg. If you link to Codeberg instead of GitHub, it works just the same.
sc68cal 1 days ago [-]
I have been struggling with this, myself. I used to push everything to GitHub, but a couple months ago I switched over to using my small low-power home server as a Git host. I used to really enjoy the feeling of pushing commits up to GitHub, and that little dopamine rush hasn't really transferred to my home machine yet.
It's a shame. The people who control the money successfully committed enshittification against open source.
ecliptik 1 days ago [-]
If you're running forgejo you can setup mirrors [1].
Mine syncs a few repos to github to make them public, and planning to add Codeberg in as well ala POSSE [2].
I mean, more around the reluctance to publish any more code publicly since it just gets sucked up by companies to train their models.
LinXitoW 1 days ago [-]
Considering that "the community" is now filled with vibe-coding slop pull-requesters and non-coders bitching in issues, the filter that not-GitHub provides becomes better and better.
Of course, that mostly goes for projects big enough to already have an independent community.
goku12 1 days ago [-]
Not to contradict you, but there's another important aspect to 'community' besides the bad contributors and the entitled complainers. That's discoverability. How do you discover a project that may be hosted anywhere on the dozens of independent forges out there? Searching each one individually is not a viable proposition. The search often ends on the biggest platform - Github.
I'm not trying to defend github here. The largest platform could have been anyone who took advantage of the early opportunities in the space; it just happens to be Github. But discoverability is still a nagging problem. I don't think that even a federated system (using ActivityPub, atproto, or whatever else is out there) is going to solve it. We need a solution that can scour the entire code-hosting space the way search engines do (but collaboratively, not aggressively like LLM scrapers).
andybak 1 days ago [-]
Ideally this should be something search engines handle - but they do a poor job in specialised areas like code repos.
It's helpful to have a github mirror of your "real" repo (or even just a stub pointing to the real repo if you object to github strongly enough that mirroring there is objectionable to you).
One day maybe there will be an aggregator that indexes repos hosted anywhere. But in many ways that would be back to square one - a single point of failure.
The Fediverse seems to dislike global search. Or is that just a mastodon thing?
goku12 16 hours ago [-]
> The Fediverse seems to dislike global search. Or is that just a mastodon thing?
Lemmy seems to do a decent enough job at global searches. It's most likely just a mastodon problem.
kelnos 4 hours ago [-]
I don't think I ever find new software through github's own search. I find them through the software's website or some other means like a search engine.
throwaway28731 1 days ago [-]
> How do you discover a project that may be hosted anywhere on the dozens of independent forges out there?
Word of mouth. Package managers. Search engines. Your LLM of choice.
Does anyone seriously use GitHub search to discover new projects?
goku12 16 hours ago [-]
Word of mouth: What if it's just some random script a guy created in a weekend? I have my code used by others in such a manner.
Package managers: Same problem as above. You missed the point of free software.
Search engines: They do a disastrous job of indexing anything on a forge. You might as well yell at the clouds instead.
LLM of choice: I'm not taking this seriously.
> Does anyone seriously use GitHub search to discover new projects?
I don't even understand the point of such questions. None of the solutions you proposed are any better at solving what I described than the insufficient method I wrote about.
skydhash 20 hours ago [-]
That was solved by forums, tech mailing lists,... If you were interested in something, you hung around the communities and almost everything interesting enough would pass by.
goku12 16 hours ago [-]
Do you hang around every forum or mailing list that discusses the solutions to problems that you may potentially encounter in the future? The type of problem I'm talking about isn't one that can be foreseen years in advance.
cdrnsf 1 days ago [-]
I've been using a self-hosted forgejo (which Codeberg uses and maintains) instance for all of my non-work projects and it's been great. I don't miss GitHub at all. I also keep it accessible only from Tailscale so that AI crawlers and such can speedily make their way into the sun.
huijzer 1 days ago [-]
I moved to self-hosted Forgejo a few years ago and I can also highly recommend it. It's working great. I have posted a tutorial [1] (verified last month that it still works), and recently moved from Hetzner to 2 Raspberry Pis for hosting the server and the runner [2]. It's great. Really rock solid. Has been more reliable and faster than GitHub.
I was self-hosting GitLab for a long time. But Forgejo is an order of magnitude less resource intensive.
It is a single very small Go binary. You can use SQLite or Postgres. But you can easily run it inside a small Docker container on your local machine.
And it is fun to hack on it because it is so open. You can build really fun workflows that are blocked by the corporate limits of GitHub.
poorman 1 days ago [-]
Same. I installed Forgejo two months ago when Github wouldn't let me create agent accounts. It's been awesome. Any time I want a new feature I open my agent on the server and tell it to add the feature to Forgejo. Took all of 15 minutes for it to add a working Show/Hide "Viewed" files on the PR reviews.
huijzer 1 days ago [-]
You mean you upstream those changes or are you running your own fork?
midasz 1 days ago [-]
Same - also installed a Forgejo runner via Docker so I've got CI. Forgejo has its own package/container registry, so the apps I make get a Docker image and I just run that image. All on my own hardware.
kroaton 1 days ago [-]
I did the same a few months ago when I read that multiple big OSS Linux projects were moving to it and it's been phenomenal so far.
dml2135 1 days ago [-]
Same, I've been enjoying it a ton. Recently, with the help of Claude, I've used it to set up an entire CI/CD pipeline for my home server. The flow is roughly:
Build Nix config into a VM image => Deploy VM to Proxmox via its API => Spin up Docker stack via Komodo
I've also been trying to use it to sync my Obsidian vault via git to my phone, although that flaked out on me recently (if anyone knows a reliable way to use git via the shell on iOS, please let me know).
eblume 1 days ago [-]
Same! I've also recently exposed mine to the internet through a fly.io proxy, though. So far, no issues, but I'm keeping a close eye.
alargemoose 1 days ago [-]
I went with Gitea, but for the same general reasons. I like that it has the option to mirror repos up to GitHub for the stuff I actually want to share with the world. Is there anything that made you choose Forgejo specifically? I'm not eager to move platforms, but I know there are more options that have popped up in the years since I first stood up my Gitea instance.
DavideNL 1 days ago [-]
Same, Gitea (in Docker) has been running very smoothly for years. Haven't had a single issue.
Also now wondering about differences with forgejo.
woodruffw 1 days ago [-]
I think evaluating alternatives to GitHub is going to become increasingly important over the coming years. At the same time, I think these kinds of migrations discount how much GitHub has changed the table stakes/raised the bar for what makes a valuable source forge: it's simply no longer reasonable to BYO CI or accept one that can't natively build for a common set of end-user architectures.
This on its own makes me pretty bearish on community-driven attempts to oust GitHub, even if ideologically I'm aligned with them: the real cost (both financial and in terms of complexity) of user expectations around source forges in 2026 is immense.
usrbinenv 1 days ago [-]
I don't understand the hype around CI and that it's supposedly impossible to run something like that without Git, let alone Github. Like sure, a nice interface is fine, but I can do with a simpler one. I don't need a million features, because what is CI (in practice today, not in theory)? It's just a set of commands that run on a remote machine and then the output of those commands is displayed in the browser and it also influences what other commands may or may not run. What exactly is the big deal here? It can probably be built internally if needed and it certainly doesn't need to depend on git so much - git can trigger it via hooks, but that's it?
I think the real problem is we were sold all these complex processes that supposedly deliver better results, while in reality for most people and orgs it's just cargo culting, like with Kubernetes, for example. We can get rid of 90% of them and be just fine. You can easily get away without any kind of CI in teams of fewer than 5-7 people, I would argue - just have some sane rules and make everyone follow them (like running unit tests before submitting a PR).
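The "just a set of commands triggered by hooks" view really can be implemented in a few lines. A runnable sketch (paths and `test.sh` are hypothetical stand-ins for a real repo and test suite) that wires a post-receive hook on a bare repo to run the tests on every push:

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"

# The "server": a bare repo whose post-receive hook acts as minimal CI
git init --quiet --bare server.git
cat > server.git/hooks/post-receive <<'EOF'
#!/bin/sh
# For each pushed ref, check out the commit and run the test suite
while read old new ref; do
  work=$(mktemp -d)
  git --work-tree="$work" checkout -qf "$new"
  if sh "$work/test.sh"; then echo "CI passed: $ref"; else echo "CI FAILED: $ref"; fi
  rm -rf "$work"
done
EOF
chmod +x server.git/hooks/post-receive

# A client pushes a commit; the hook's output comes back in the push output
git clone --quiet server.git client
cd client
git config user.email ci@example.com; git config user.name ci
echo 'exit 0' > test.sh   # stand-in for a real test suite
git add test.sh && git commit -qm 'add passing test'
git push origin HEAD 2>&1 | tee push.log
```

What GitHub Actions adds on top of this core loop is mostly hosted runners, a web UI for the logs, and status reporting back onto the PR - valuable, but a separate concern from the "run commands on push" part.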
lmm 22 hours ago [-]
I find CI very valuable even on my solo projects.
> what is CI (in practice today, not in theory)? It's just a set of commands that run on a remote machine and then the output of those commands is displayed in the browser and it also influences what other commands may or may not run. What exactly is the big deal here?
The key is hermetically/reproducibly - you don't want to run commands on some random machine, you want to run commands on a well-controlled/versioned machine that won't change under you, that you can recreate at will. Which sure you should be able to do with Nix or something, but the kind of person who doesn't want to use CI doesn't want to use Nix either.
And the other key is whose machine? Particularly when it comes to e.g. Macs, which are expensive. Maybe you have a spare server farm in your closet, but most people don't.
kelnos 3 hours ago [-]
These days the reproducibility part is trivial with a Docker container, as much as it's a mess of a technology.
gizzlon 16 hours ago [-]
For a solo dev, what are the advantages of _not_ building on your own machine?
Is the compiling and test running too resource intensive?
Do you build every commit? If so, why?
I see the value in larger teams, but for solo stuff I just find it slow and annoying. I'm using go, and it compiles fast, so that could be a part of it.
lmm 16 hours ago [-]
> For a solo dev, what are the advantages of _not_ building on your own machine?
I end up with all kinds of random crap on my own machine. It's very easy to accidentally e.g. globally install a library that wasn't properly listed in my dependency management. So having a separate standardised/controlled build environment is a good way to catch those. It also helps with flaky tests or random "works on my machine" problems - this way my tests are at least getting run on two quite different machines (different OS/arch/etc.)
skydhash 20 hours ago [-]
> The key is hermetically/reproducibly
Why not use VMs? Libvirt is scriptable enough for that. And LXC/Incus can be used if you want the shorter starting time.
lmm 19 hours ago [-]
Ok, that solves like 20% of the problem. How (and where) are you provisioning these VMs? How are you managing what versions of what are installed on them, and is that process reproducible?
None of this is hard, exactly, but you do have to put in the legwork of doing it, and it's mostly only the big players who've done so.
skydhash 12 hours ago [-]
A lot of VM managers allow cloning from a disk image. And some even allow an overlay layer on top of a read-only base disk.
> just have some sane rules and make everyone follow them (like run unit tests before submitting a PR)
and thus you discover the value of CI
IshKebab 1 days ago [-]
The big deal is that GitHub provides it for free. Plus it integrated properly into the PR workflow.
Good luck implementing merge queues yourself. As far as I know there are no maintained open source implementations of merge queues. It's definitely not as trivial as you claim.
prmoustache 1 days ago [-]
> it's simply no longer reasonable to BYO CI
Why? I know plenty of teams which are fine with repo and CI being separate tools as long as there is integration between the 2.
CuriouslyC 1 days ago [-]
Actions are bad, but they're free (to start) and just good enough that they're useful to set up something quick and dirty, and tempt you to try and scale it for a little while.
knocte 1 days ago [-]
Exactly. Any GitHub alternative needs to consume the same GitHub Actions syntax OOTB, I'm afraid.
rmunn 20 hours ago [-]
Which, as far as I've found so far, means Forgejo. I haven't found any others. And even Forgejo Actions says that it's mostly the same as GitHub Actions syntax, meaning you still have to double-check that everything works the same. It probably will, but if you don't know what the corner cases are then you have to double-check everything. Still, it's probably the best migration option if you rely on GHA.
suslik 17 hours ago [-]
Gitea also, I think.
woodruffw 1 days ago [-]
Emphasis on teams; the median open source project has a fraction of a single person working on it.
wongarsu 1 days ago [-]
CI needs good integration into the source forge. But I don't really perceive Github Actions as a huge benefit over the times when everyone just set up CircleCI or whatever. As long as it can turn PR checks red, yellow and green and has a link to the logs I'm happy
The whole PR and code review experience is much more important to me. Github is striving to set a high bar, but is also hilariously bad in some ways. Similarly the whole issue system is passable on Github, but doesn't really reach the state of the art of issue systems from 20 years ago
psychoslave 1 days ago [-]
Working with all these modern layers, I don't see why people bother so much about it. This is all an upper-level decision to centralize so they feel they keep control. As a dev I'm 100% confident life would be at least as pleasant without all these abysmal layers of remote services, which could all be replaced with distributed solutions that work 100% locally with a thin sync step here and there.
WhyNotHugo 1 days ago [-]
> it's simply no longer reasonable to BYO CI or accept one that can't natively build for a common set of end-user architectures.
GitHub’s hosted runners support a grand total of two architectures.
The only forges which I’ve seen with more variety are distributions’ forges, usually hosting their own runners.
woodruffw 1 days ago [-]
I probably should have said platform tuples, I guess. I'm not aware of a free alternative to GitHub that gives me (AMD64, ARM64) x (macOS, Windows, Linux).
jamiemallers 1 days ago [-]
[dead]
asim 1 days ago [-]
Why? I want to understand why. Out of principle? I think some services just end up becoming foundational and we need to move on to other things for other things, e.g. if we're going to replace GitHub it's because we're creating new habits, not because we're replacing like for like. That never works. What is a new code hosting platform offering? You know what, pair it with some app dev and great, now you've got something. But just hosting elsewhere - it's got to be a major step change, the way GitHub was from SourceForge and self-hosting. Inherently the social aspects drove that, and the power of git. Personally I think you have to intertwine the code hosting with app development using agents, like a Google doc. Commits every time there is a change. Every prompt creates a commit. I don't know. We don't need to reinvent the wheel for nothing.
finnjohnsen2 1 days ago [-]
Geopolitics is a reason. Many individuals and companies are scrambling for safe alternatives to US tech. I live in Norway and there is a lot of this going on.
voxic11 1 days ago [-]
> What is a new code hosting platform offering.
For me it's providing uptime. Github is barely reaching one nine of availability these days.
0x3f 1 days ago [-]
Why does this post exist? I assume because of the Copilot story that's also trending and the subsequent loss of trust. Not sure if Github has serious alternatives, but the desire to move is not for 'nothing'.
999900000999 1 days ago [-]
GitHub gives you a lot for "free". In exchange they'll have no problem harvesting your data, and it would really surprise me if they aren't training on private repos too. I guess you can opt out, and if their opt-out doesn't work, oh well.
On the other hand Codeberg doesn't let you create private repositories at all. So Copilot could still legally scrape your open source Codeberg repos.
I don't see much of a point for most people.
https://docs.codeberg.org/getting-started/faq/
>If you need private repositories for commercial projects (e.g. because you represent a company or are a developer that needs a space to host private freelance projects for your clients), we would highly recommend that you take a look at Forgejo. Forgejo is the Git hosting software that Codeberg runs. It is free software and relatively easy to self-host. Codeberg does not offer private hosting services.
ashton314 20 hours ago [-]
You can have private repositories. It's discouraged because Codeberg is meant for FOSS projects, but it's totally possible.
b00ty4breakfast 1 days ago [-]
>On the other hand Codeberg doesn't let you create private repositories at all.
are you sure about that? I'm fairly certain my repos on codeberg are all private but I could be mistaken.
mfenniak 1 days ago [-]
It is kinda incorrect and kinda correct. Codeberg allows you to create private repositories. However, their rules are clear that the intent of private repositories must be in support of Free software projects: https://docs.codeberg.org/getting-started/faq/#how-about-pri..., which for many people is effectively not allowing private repositories.
arcanemachiner 1 days ago [-]
My reading is: Don't attempt to store your media (photos, music, videos, etc.) collection there, and you won't have any issues.
shunia_huang 17 hours ago [-]
I don't like how it reads differently depending on perspective; they should be crystal clear about what's allowed and what's not. When they say '...unhappy about...' as a policy statement, it feels kinda amateur to me, although I'm pretty sure the operators are way more capable than me lol.
arcanemachiner 2 hours ago [-]
Totally agreed. I get the feeling they have not had to draw too many lines in the sand... yet. So they're probably keeping it flexible until they have to really start swinging the hammer.
erdaniels 1 days ago [-]
I just migrated our entire company off of github to gitlab self-hosted. So far so good. It's entirely behind tailscale so we don't have any SSO tax from gitlab and all of our CI runners are on EKS + an on-prem cluster with GPUs. If anyone needs help or motivation accomplishing the same, just reach out!
literallyroy 1 days ago [-]
Does self-hosted support SCIM or something else easy to connect for automatic user provisioning? Or do y'all do account creation manually?
erdaniels 1 days ago [-]
it does support it (https://docs.gitlab.com/administration/settings/scim_setup/) but we're not using it. We just do manual invites for now at our size. As far as kicking users out goes, if they're kicked out of gsuite, they're kicked out of tailscale, and they lose their gitlab access.
zufallsheld 1 days ago [-]
In corporate environments most people connect their GitLab to LDAP or Active Directory, and thus accounts get created automatically.
bachittle 1 days ago [-]
Did you also try Forgejo? If so, what are the differences between the two? I didn't even know GitLab had a self-hosted option. I assume it's probably better for Enterprise-grade projects, and dealing with CI/CD, actions, etc. But for smaller projects that just have issues and PRs and minor test suites, I assume Forgejo is the better lightweight option.
erdaniels 1 days ago [-]
Yeah, I tried hosting Forgejo and the first issue I found was that it was crashing some of the time with our large monorepo, and getting actions/runners up and running was proving time consuming; I really did like how lightweight it was, monolith-wise. GitLab has a lot more architecture behind it, but the documentation is very good at describing how you should configure it for your needs.
I think Forgejo would work fine for smaller projects and teams. We really wanted to stop having to worry about GitHub going down and not being able to do CD, as well as get away from a lot of the Actions zero-days happening.
And yes, it's self-hosted and free! You can run a reference implementation pretty easily with non-production components (i.e. they won't backup or scale well).
INTPenis 1 days ago [-]
Lazy has nothing to do with it, codeberg simply doesn't work.
Most of my friends who use codeberg are staunch cloudflare-opponents, but cloudflare is what keeps Gitlab alive. Fact of life is that they're being attacked non-stop, and need some sort of DDoS filter.
Codeberg has that anubis thing now I guess? But they still have downtime, and the worst thing ever for me as a developer is having the urge to code and not being able to access my remote. That is what murders the impression of a product like codeberg.
Sorry, just being frank. I want all competitors to large monopolies to succeed, but I also want to be able to do my job/passion.
embedding-shape 1 days ago [-]
Maybe I'm too old school, but both GitHub and Codeberg for me are asynchronous "I want to send/share the code somehow" places, not "my active workspace I require to do work". But reading
> the worst thing ever for me as a developer is having the urge to code and not being able to access my remote.
Makes it seem like GitHub/Codeberg has to be online for you to be able to code, is that really the case? If so, how does that happen, you only edit code directly in the GitHub web UI or how does one end up in that situation?
freedomben 1 days ago [-]
For me it's a soft block rather than a hard block. I use multiple computers, so when I switch to the other one I usually do a git pull, and after every commit I do a push. If that gets interrupted, then I have to resort to things like rsyncing over from the other system, but more than once I've lost work that way. I'm strongly considering just standing up a VM and using "just git", foregoing any UI, but I make use of other features like CI/CD and Releases for distribution, so the VM strategy is still just a bandaid. When the remote is unavailable, it can be very disruptive.
embedding-shape 1 days ago [-]
> If that gets interrupted, then I have resort to things like rsyncing over from the other system
I'm guessing you have SSH access between the two? You could just add it as another remote, via SSH, so you can push/pull directly between the two. This is what I do on my home network to sync configs and other things between various machines and OSes, just do `git remote add other-host git+ssh://user@10.55/~/the-repo-path` or whatever, and you can use it as any remote :)
Bonus tip: you can use local paths as git remote URLs too!
> but more than once I've lost work that way.
Huh, how? If you didn't push it earlier, you could just push it later? Same goes for pull? I don't understand how you could lose anything tracked in git - corruption, or what happened?
freedomben 1 days ago [-]
Usually one of two things, mostly the latter: I forget to exclude the .git/ directory from the sync, or I have in-progress and nowhere near ready for commit changes on both hosts, and I forget and sync before I check. These are all PEBKAC problems and/or workflow problems, but on a typical day I'll be working in or around a half-dozen repos and it's too easy to forget. The normal git workflow protects from that, because uncommitted changes in one can just be rebased easily the next time I'm working on it from any given computer. I've been doing it like this for nearly 20 years and it's never been an issue because remotes were always quite stable/reliable. I really just need to change my workflow for the new reality, but old habits die hard.
SoftTalker 1 days ago [-]
> just standing up a VM and using "just git"
That's what I do. Control your entire world yourself.
messe 1 days ago [-]
If you can rsync from the other system, and likely have an SSH connection between them, why don't you just add it as an additional remote and git pull from it directly?
freedomben 1 days ago [-]
I probably could. How does that work with uncommitted changes on the host? Would that be a problem?
rlpb 1 days ago [-]
You cannot git push something that is not committed. The solution is to commit often (and do it over ssh if you forget on a remote system). It doesn't need to be a presentable commit. That can be cleaned up later. I use `git commit -amwip` all the time.
Sure, you might neglect to add a file to your commit, or commit at all, but that's a problem whether you're pushing to a central public git forge or not.
debugnik 1 days ago [-]
You'd create a bare git repo (just the contents of .git) on the host with git init --bare, separate from your usual working tree, and set it as a remote for your working trees, to which you can push and pull using ssh or even a path from the same machine.
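A runnable sketch of that setup, using a local path as the "remote" (an ssh:// URL works the same way; all names here are hypothetical):

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"

# The "hub": a bare repo (just the contents of .git, no working tree)
git init --quiet --bare hub.git

# A working tree that uses the bare repo as its remote
git init --quiet work && cd work
git config user.email me@example.com; git config user.name me
echo hello > notes.txt
git add notes.txt && git commit -qm 'first commit'
git remote add hub ../hub.git   # an ssh://user@host/path URL would go here instead
git push --quiet hub HEAD

# Any other machine/checkout can now fetch and pull from hub.git
git ls-remote hub
```

The same `hub.git` can be added as a remote from every machine that can reach it over ssh, which gives you the multi-computer pull/push workflow without any forge in the middle.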
thwarted 1 days ago [-]
If you have ssh access to the remote machine to set up a git remote, you can login to the remote machine and commit the changes that you forgot to commit.
(You'll need to season to taste: what to do with staged changes, how to make sure both trees are in the same HEAD, etc)
cyberrock 1 days ago [-]
For some projects, the issue tracker is a pretty integral part of the documentation. Sure, you can host your own issue tracker somewhere, but that's still shifting a center point somewhere, in a theoretically decentralized system. I've frequently wished the issue tracker was part of the repository. Also -- love them or hate them -- LLMs would probably love that too.
the_mitsuhiko 1 days ago [-]
My main exposure to Codeberg is Zig and it has an issue tracker there and I pull in changes from it.
For how infrequent I interface with Codeberg I have to say that my experience has been pretty terrible when it comes to availability.
So I guess the answer is: the availability is bad enough that even infrequent interactions with it are a problem.
dspillett 1 days ago [-]
> Makes it seem like GitHub/Codeberg has to be online for you to be able to code, is that really the case?
I can understand that for work with other active contributors, but I agree with you that it is a daft state of affairs for a solo or mostly-solo project.
Though if you have your repo online even away from the big places, it will get hit by the scrapers and you will end up with admin to do because of that, even if it doesn't block your normal workflow because your main remote is not public.
pferde 1 days ago [-]
I was shaking my head in disbelief when reading that part too. I mean, git's whole raison d'etre, back when it was introduced, was that you do not need online access to the repo server most of the time.
sodapopcan 1 days ago [-]
It's getting even worse if you read the thread about Claude going down the other day. People were having mini panic attacks.
dspillett 1 days ago [-]
> git's whole raison d'etre […] was that you do not need online access to the repo server most of the time
Not really. The point of git was to make Linus' job of collating, reviewing, and merging work from a disparate team of teams much less arduous. It just happens that many of the patterns needed for that also mean making temporarily disconnected remote repositories work well.
dwedge 1 days ago [-]
The whole point of git was to be a replacement for BitKeeper, after the Linux developers got banned from it for "hacking" after Andrew Tridgell connected to the server over telnet and typed "HELP"
dspillett 1 days ago [-]
That too, though the point of using a distributed source control system was the purpose I mentioned. But even before BitKeeper got in a tizzy about Tridgell's¹ shenanigans, there was talk of replacing it because some properties of it were not ideal for something as large as the kernel with as many active contributors, and there were concerns about using a proprietary product to manage the Linux codebase. Linus was already tinkering with what would become the git we know.
--------
[1] He did a lot more than type “help” - he was essentially trying to reverse engineer the product to produce a compatible but more open client that gave access to metadata BitKeeper wanted you to pay to be able to access² which was a problem for many contributors.
[2] you didn't get the fullest version history on the free variants; this was one of the significant concerns making people discuss alternatives, and in some high profile cases just plain refuse to touch BitKeeper at all
ragall 1 days ago [-]
> I mean, git's whole raison d'etre, back when it was introduced, was that you do not need online access to the repo server most of the time.
So what ? That's not how most people prefer to use it.
bigstrat2003 17 hours ago [-]
So those people are using the tool incorrectly, and would have a much better experience if they used it as designed. If everyone was running around using screwdriver handles to pound in nails, that wouldn't make it reasonable to say that any new screwdriver company has to have 5 lb handles.
ragall 9 hours ago [-]
> So those people are using the tool incorrectly
They're not. They're using it very correctly, by choosing to ignore a capability that's irrelevant to them.
> If everyone was running around using screwdriver handles to pound in nails
Stop thinking by analogy. It harms your ability to think correctly.
tonymet 1 days ago [-]
You’re right this is the proper way to use git. And I encourage developers to use their own cloud storage (or remote volume) for their primary remote.
Even with the best habits, there will be a few times a month where you forgot to push everything up and you're blocked from work.
Codeberg needs to meet the highest availability levels for it to be viable.
freedomben 1 days ago [-]
I've had the same experience.
Philosophically I think it's terrible that Cloudflare has become a middleman in a huge and important swath of the internet. As a user, it largely makes my life much worse. It limits my browser, my ability to protect myself via VPNs, etc., and I am just browsing normally, not attacking anything. Pragmatically though, as a webmaster/admin/whatever you want to call it nowadays, Cloudflare is basically a necessity. I've started putting things behind it because if I don't, 99%+ of my traffic is bots, and often bots clearly scanning for vulnerabilities (I run mostly zero-PHP sites, yet my traffic logs are often filled with requests like /admin.php and /wp-admin.php and all the WordPress things, and constant crawls from clearly-not-search-engines that download everything and use robots.txt as a guide of what to crawl rather than what not to crawl). I haven't been DDoSed yet, but I've had images and PDFs and things downloaded so many times by these things that it costs me money. For some things where I or my family are the only legitimate users, I can just firewall-cmd all IPs except my own, but even then it's maintenance work I don't want to have to do.
I've tried many of the alternatives, and they often fail even on legitimate use cases. I've been blocked more by the alternatives than I have by Cloudflare, especially that one that does a proof of work. It works about 80% of the time, but that 20% is really, really annoying, to the point that when I see that screen pop up I just browse away.
It's really a disheartening state we find ourselves in. I don't think my principles/values have been tested more in the real world than the last few years.
rglullis 1 days ago [-]
Either I am very lucky or what I am doing has zero value to bots, because I've been running servers online for at least 15 years, and never had any issue that couldn't be solved with basic security hygiene. I use cloudflare as my DNS for some servers, but I always disable any of their paid features. To me they could go out of business tomorrow and my servers would be chugging along just fine.
j16sdiz 1 days ago [-]
Sometimes it is not security - it could be just bandwidth or CPU.
I have a website small enough not to attract too many bots, but sometimes something very innocent can bring it down.
For example, I put up a PHP iCal viewer... and some crawler started loading the calendar page, taking up all the CPU cycles.
rglullis 1 days ago [-]
Even the most minimal protection would stop that.
dspillett 1 days ago [-]
> and use robots.txt as a guide of what to crawl rather than what not to crawl
Mental note, make sure my robots.txt files contain a few references to slowly returning pages full of almost nonsense that link back to each other endlessly…
Not complete nonsense, that would be reasonably easy to detect and ignore. Perhaps repeats of your other content with every 5th word swapped with a random one from elsewhere in the content, every 4th word randomly misspelt, every seventh word reversed, every seventh sentence reversed, add a random sprinkling of famous names (Sir John Major, Arc de Triomphe, Sarah Jane Smith, Viltvodle VI) that make little sense in context, etc. Not enough change that automatic crap detection sees it as an obvious trap, but more than enough that ingesting data from your site into any model has enough detrimental effect to token weightings to at least undo any beneficial effect it might have had otherwise.
And when setting traps like this, make sure the response is slow enough that it won't use much bandwidth, and the serving process is very lightweight, and just in case that isn't enough make sure it aborts and errors out if any load metric goes above a given level.
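A crude sketch of just the word-mangling part of that scheme (only the every-4th-misspelt and every-7th-reversed rules; the name-sprinkling, sentence reversal, and rate limiting are left out), as an awk filter:

```shell
poison() {
  awk '{
    for (i = 1; i <= NF; i++) {
      w = $i
      if (i % 7 == 0) {                      # reverse every 7th word
        r = ""
        for (j = length(w); j >= 1; j--) r = r substr(w, j, 1)
        w = r
      } else if (i % 4 == 0 && length(w) >= 2) {
        # "misspell" every 4th word by swapping its first two characters
        w = substr(w, 2, 1) substr(w, 1, 1) substr(w, 3)
      }
      printf "%s%s", w, (i < NF ? " " : "\n")
    }
  }'
}

echo "the quick brown fox jumps over the lazy dog wanders far away" | poison
```

The output stays plausible-looking at a glance but is quietly wrong everywhere, which is the point: cheap to generate on the fly, subtle enough not to trip obvious crap detection.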
matrss 1 days ago [-]
So, basically iocaine (https://iocaine.madhouse-project.org/). It has indeed been very useful to get the AI scraper load on a server I maintain down to a reasonable level, even with its not so strict default configuration.
First time seeing that, but yes, seems similar in concept. Iocaine can be self-hosted and put in as a "middleware" in your reverse proxy with a few lines of config, cloudflare's seems tied to their services. Cloudflares also generates garbage with generative models, while iocaine uses much simpler (and surely more "crude") methods of generating its garbage. Using LLMs to feed junk to LLMs just makes me cry, so much wasted compute.
Is iocaine actually newer though? Its first commit dates to 2025-01, while the blog post is from 2025-03. I couldn't find info on when Cloudflare started theirs. There's also Nepenthes, which had its first release in 2025-01 too.
johnisgood 1 days ago [-]
If I think about it, I find it awful. The fact that we need to put junk in our own stuff just for crawlers does not sit well with me.
dspillett 15 hours ago [-]
Yes, except with the content being based on the real content rather than completely random. My intuition says that this will be more effective, specifically poisoning the model wrt tokens relating to that content rather than just increasing the overall noise level a bit (the damage there being smoothed out over the wider model).
1 days ago [-]
freedomben 1 days ago [-]
Hot damn, this is a great idea! Reminds me fondly of an old project a friend and I built that looks like an SSH prompt or optionally an unauthed telnet listener, which looks and feels enough like a real shell that we would capture some pretty fascinating sessions of people trying to explore our system or load us with malware. Eventually somebody figured it out and then DDoSed the hell out of our stuff and would not stop hassling us. It was a good reminder that yanking people's chains sometimes really pisses them off and can attract attention and grudges that you really don't want. My friend ended up retiring his domain because he got tired of dealing with the special attention. It did allow us to capture some pretty fascinating data though that actually improved our security while it lasted.
Ferret7446 1 days ago [-]
This is one reason why most crawlers ignore robots.txt now. The other reason is that bandwidth/bots are cheap enough now that they don't need web admins to help them optimize their crawlers.
dspillett 7 hours ago [-]
> This is one reason why most crawlers ignore robots.txt now.
I don't buy that for a second. Those not obeying robots.txt were doing so either because they were malicious (they wanted everything and wouldn't be told “please don't plough through these bits”) or stupid (not knowing any better) or both.
Anyone who was obeying robots.txt isn't going to start ignoring it because we've put honeypots there. Why would they think “well, now there are honeypots there I'm going to go scan those… honeypots, yeah, that's a good idea”.
> The other reason is that bandwidth/bots are cheap enough now that they don't need web admins to help them optimize their crawlers
Web admins are not trying to optimize their crawlers, they are trying to stop their crawlers breaking sites.
dwedge 1 days ago [-]
While I sympathise, I disagree with your stance. Cloudflare handles a large % of the Internet now because of people putting sites there that, as you admitted, don't need to be behind it.
kitsune1 1 days ago [-]
[dead]
1 days ago [-]
frevib 1 days ago [-]
OP is about Github. Have you seen the Github uptime monitor? It’s at 90% [1] for the last 90 days. I use both Codeberg and Github a lot and Github has, by far, more problems than Codeberg. Sometimes I notice slowdowns on Codeberg, but that’s it.
I stopped using GitHub a long time ago. I don't understand why GitLab isn't the default alternative.
throwaway28731 1 days ago [-]
[dead]
kevinfiol 1 days ago [-]
To be fair, GitHub has several orders of magnitude more users than Codeberg. I'm also a Codeberg user, but I don't think anyone has seen a Forgejo/Gitea instance working at the scale of GitHub yet.
jrudolph 1 days ago [-]
To be fair, GitHub has several orders of magnitude more revenue to support that. Including from companies like mine who are paying them good money and getting absolutely sub-par service and reliability in return. I'd be happy for Codeberg to take my money for a better service on the core feature set (git hosting, PRs, issues). I can take my CI/CD elsewhere; we self-host runners anyway.
apetresc 1 days ago [-]
I don't think OP was making a value judgment or anything. It's just weird to say you won't consider Codeberg because you need reliability when Codeberg's uptime is at 100% and Github's is at 90%.
era-epoch 1 days ago [-]
I think the idea is that a Forgejo/Gitea instance should never have to work at anywhere near the scale of GitHub. Codeberg provides its Forgejo host as a convenience/community thing but it's not being built to be a central service.
kjuulh 1 days ago [-]
My own git server has been hit severely by scrapers. They're scraping everything. Commits, comparisons between commits, api calls for files, everything.
And pretty much all of them, ByteDance, OpenAI, AWS, Claude, various I couldn't recognize. I basically just had to block all of them to get reasonable performance for a server running on a mini-pc.
I was going to move to Codeberg at some point, but they had downtime when I was considering it; I'd rather deal with that myself, then.
marginalia_nu 1 days ago [-]
Anyone actually scraping git repos would probably just do a 'git clone'. Crawling git hosts is extremely expensive, as git servers have always been inadvertent crawler traps.
They generate a URL for every version of every file on every commit and every branch and tag, and if that wasn't enough, n(n+1)/2 git diffs for every file on every commit it has existed on. Even a relatively small git repo with a few hundred files and commits explodes into millions of URLs in the crawl frontier. Server-side, many of these are very expensive to generate as well, so it's really not a fantastic interaction, crawler and git host.
If you run a web crawler, you need to add git host detection to actively avoid walking into them.
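The explosion is easy to ballpark; the repo size here is invented, but the arithmetic is the point:

```shell
# Hypothetical repo: 300 files, 200 commits
files=300
commits=200

# One blob-view URL per file per commit (blob pages alone):
echo $(( files * commits ))                      # 60000

# Pairwise diff URLs per file: n(n+1)/2 with n = commits
echo $(( files * commits * (commits + 1) / 2 ))  # 6030000
```

Six million URLs from a few hundred files, before counting branches, tags, or blame views.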
Tharre 1 days ago [-]
And yet, it's exactly what all the AI companies are doing. However much it costs them in server costs and goodwill seems to be worth less to them than the engineering time to special-case the major git web UIs.
marginalia_nu 1 days ago [-]
I doubt they're actually interested in the git repos.
From the shape of the traffic it just looks like a poorly implemented web crawler. By default, a crawler that does not take measures to actively avoid git hosts will get stuck there and spend days trying to exhaust the links of even a single repo.
kjuulh 24 hours ago [-]
For me it was specifically crawlers from the large companies; they were at least announcing themselves as such. They did have different patterns: ByteDance was relatively well-behaved, but some of the less known ones had weird patterns of looking at comparisons.
I do think they care about repos, and not just the code, but also how it evolves over time. I can see some use, if marginal, in those traits. But if they really wanted that, I'd rather they clone my repos; I'd be totally fine with that. But I guess they'd have to deal with state, and they likely don't want to deal with that. Rather just increase my energy bill ;)
Eldt 1 days ago [-]
How probable is your "probably"?
marginalia_nu 1 days ago [-]
Well, one is 60 repos per hour, and the other is 60 hours per repo.
prmoustache 1 days ago [-]
The whole point of git is to be decentralized so there is no reason for you to not have your current version available even when a remote is offline.
mr_mitm 1 days ago [-]
How do people even on hacker news of all places conflate git with a code hosting platform all the time? Codeberg, GitHub or whatever are for tracking issues, running CI, hosting builds, and much more.
The idea that you shouldn't need a code hosting platform because git is decentralized is so out of place that it is genuinely puzzling how often it pops up.
prmoustache 1 days ago [-]
How do people on hacker news keep having reading issues?
The parent post mentioned:
"the worst thing ever for me as a developer is having the urge to code and not being able to access my remote."
Emphasis on "code", not triaging issues, merging other people's branches, etc.
Besides there are tools to sync forgejo repositories including PRs and issues.
hombre_fatal 1 days ago [-]
OP didn't conflate them.
They said they want to be able to rely on their git remote.
The people responding are saying "nah, an unreliable remote is fine because you can use other remotes" which doesn't address their problem. If Codeberg is unreliable, then why use it at all? Especially for CI, issues, and collab?
keybored 1 days ago [-]
The person you’re replying to is saying that you can do everything outside of tracking issues, running CI, ... without a remote. Like all Git operations that are not about collaboration. (but there is always email)
Maybe a hard blocker if you are pair programming or collaborating every minute. Not really if you just have one hour to program solo.
10 hours ago [-]
dandellion 1 days ago [-]
It's also trivial to have multiple remotes, I do in most of my repos. When one has issues I just push to the other instead of both.
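For anyone who hasn't set this up before, both variants are a couple of commands; the remote names and URLs here are placeholders:

```shell
# Option 1: two independent remotes, push to whichever forge is up
git remote add github   git@github.com:alice/project.git
git remote add codeberg git@codeberg.org:alice/project.git
git push codeberg main

# Option 2: give an existing `origin` two push URLs so one push lands on both
git remote set-url --add --push origin git@github.com:alice/project.git
git remote set-url --add --push origin git@codeberg.org:alice/project.git
git push origin main
```

With the second variant, fetches still come from origin's original URL; only pushes fan out to both forges.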
ragall 1 days ago [-]
The original intent of the authors is by now irrelevant. The current "point" of git is that it's the most used version control solution, with good tooling support from third parties. Nothing more. And most people prefer to use it in a centralised fashion.
prmoustache 1 days ago [-]
That doesn't remove the fact that when people are working on the code, their local copy doesn't disappear after they pushed their commits and a local copy is still available.
Only exception is when people are using the code editor embedded in the "forge" but this is usually an exceptional use rather than the norm.
ragall 21 hours ago [-]
> That doesn't remove the fact that when people are working on the code, their local copy doesn't disappear after they pushed their commits and a local copy is still available.
It doesn't remove it but doesn't make it very relevant either, because of all the tests that are necessarily done remotely and can't be done locally, and without that feedback in many cases development is not possible.
zelphirkalt 1 days ago [-]
Probably has happened at some point, but personally, I have not experienced downtime of Codeberg yet. The other day, however, GitHub was down again. I have not used GitLab for a while, and when I used it, it worked fine, and its CI seems saner than GitHub's to me, but GitLab is not the most snappy user experience either.
Well, Codeberg doesn't have all the features I did use of Gitlab, but for my own projects I don't really need them either.
> for me as a developer is having the urge to code and not being able to access my remote
I think that's the moment when you choose to self host your whatever git wrapper. It really isn't that complicated to do and even allows for some fun (as in cheap and productive) setups where your forge is on your local network or really close to your region and you (maybe) only mirror or backup to a bigger system like Codeberg/GitHub.
In our case, we also use that as an opportunity to mirror OCI/package repositories for dependencies we use in our apps and during development so not only builds are faster but also we don't abuse free web endpoints with our CI/CD requests.
nfredericks 1 days ago [-]
I agree. I switched to Codeberg but switched back after a few months. Funny enough, I found there to be more unreported downtime on Codeberg than GitHub.
1 days ago [-]
maelito 1 days ago [-]
> Lazy has nothing to do with it, codeberg simply doesn't work.
Been working on it for months now, it does work, lol.
z3t4 1 days ago [-]
I find irony in that Git was made to get rid of central repos, and then we re-introduce them.
johnisgood 1 days ago [-]
That is what we have been doing for quite some time now, from what I gathered. Every time I see something becoming popular, I am like "Hmm, I've seen this before", and I really have. They just gave it a fancier name with a fancier logo and did some marketing and there you go, old is new.
mixmastamyk 1 days ago [-]
[flagged]
youarewashed 1 days ago [-]
[flagged]
DaSHacka 1 days ago [-]
Thanks for your input
mplanchard 1 days ago [-]
I've been mostly off the GitHub train since the MS acquisition, and think any alternative is a good alternative. Codeberg is great.
I've also been very happy with sourcehut for most of my personal projects for some time. The email patch submission workflow is a tad bit unfamiliar for most, but IMO in today's era raising that barrier to entry is mostly a good thing for OSS projects.
I also strongly prefer a simple CI environment (where you just run commands), which encourages you to actually be able to run your CI commands locally.
ponkpanda 1 days ago [-]
Repo hosting is the kind of thing that ought to be distributed/federated.
The underlying protocol (git) already has the cryptographic primitives that decouples trust in the commit tree (GPG or SSH signing) with trust in the storage service (i.e. github/codeberg/whatever).
All you need to house centrally is some SSH and/or gpg key server and some means of managing namespaces which would benefit from federation as well.
You'd get the benefits of de-centralisation - no over-reliance on actors like MS or cloudflare. I suppose if enough people fan out to gitlab, bitbucket, self hosting, codeberg, you end up with something that organically approximates a formally decentralised git repo system.
> Repo hosting is the kind of thing that ought to be distributed/federated.
Hence Tangled and ForgeFed (which I believe is being integrated into Forgejo)
mikepurvis 1 days ago [-]
I hadn't heard of either of these, but I'm interested.
I think at this point the bigger barrier to me with leaving GitHub (professionally, at least) is all the non-GitHub stuff that integrates nicely with it and badly or not at all with other solutions. And like, I don't blame tool providers for making a rational economic choice in that regard, but if leaving GitHub means leaving seamless Sentry, Depot, Linear, editor plugins, AI integrations, etc that makes it a tougher pill to swallow.
I worked for years at a shop that had in-house GitLab and we felt this pain first hand all the time. GitLab tries to be a one-stop shop and own the whole project management and testing/deployment workflow by building everything in house, but there were always gaps and it was hard not to be jealous of places that just did everything on GitHub and could use whatever best in class saas stuff they wanted.
Gitlab has been tracking a federation feature since at least 2018 [1], and I expect bitbucket, sourcehut, gitea, and others would move quickly on something like this as well, but there needs to be a protocol defined and some kind of plan for handling spam/abuse.
Might be more difficult for people with private repos, as I recall Codeberg doesn't like private repos on their platform.
xeeeeeeeeeeenu 1 days ago [-]
If you have a server, some cheap VPS will suffice, you can host a private git repo there without installing anything. Run this on your server:
git init --bare foo.git
and then on your PC you can do this:
git clone user@yourserver.com:~/foo.git
It's probably a good idea to make a separate user account on the server for it, though.
throwa356262 1 days ago [-]
This is great, but you can also run Forgejo (the software behind Codeberg) on your VPS.
It is a single binary and I think it is also very light on resources, at least compared to GitLab.
dqv 1 days ago [-]
I've actually been meaning to set up a forgejo instance on pikapods. Apparently it's 2 USD/month to do it.
systems 1 days ago [-]
I just noticed this: they don't allow private repos (with a few exceptions).
I wonder why they don't just offer unlimited private repos for (reasonably) paid accounts. I think maybe 40 dollars per year (or 4 dollars monthly) is low and encouraging, and should be welcomed by many. I hope they consider it.
wongarsu 1 days ago [-]
Codeberg is a German nonprofit. To keep their tax-advantaged status, anything they do has to follow the purpose established in their bylaws. That purpose is "to promote the creation, collection, distribution and preservation of Free Content (Open Content, Free Cultural Works) and Free and Open Source Software (FOSS) and their documentation in selfless work to enable equal opportunities regarding the access to knowledge and education. Furthermore, this also intends to raise awareness for the social and philosophical questions interconnected with this."
I imagine they would argue that private repositories do not follow this purpose, as they are neither free content nor FOSS. I believe you could argue that charging a modest fee for private repositories to finance the hosting of FOSS repositories is in line with the purpose, but you get on thinner ice with that. It could quickly make them appear more like a company than like a nonprofit.
mrbluecoat 1 days ago [-]
Is there a "Moving open source search from GitHub to XYZ, for lazy people"? When I'm looking for solutions to problems that open source might be able to solve, I find the fracturing of code hosting platforms an annoyance.
fhennig 1 days ago [-]
Can you elaborate what the problem is? IMO hosting and search are quite decoupled, why not just search for "open source solution to problem XYZ" in your favorite search engine?
mrbluecoat 1 days ago [-]
I specifically like the filtering to say "permissive license in Go language"
ramon156 1 days ago [-]
Instead of "moving", let's at least have some mirrors up. Mirror all your repos to CB, make a FOSS tool to do this automagically for you. Let users be able to just click a button and boom, mirror.
The goal is to get at least a % available on CB, then we can think about where the community is
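The plumbing for such a tool already exists in git itself; per repo it boils down to the following (the URLs are placeholders, and Forgejo's built-in mirror feature is disabled on Codeberg, so the push has to come from your side):

```shell
# Take a full mirror clone (all branches, tags, refs)
git clone --mirror https://github.com/alice/project.git project.git
cd project.git

# Point a second remote at Codeberg and push everything
git remote add codeberg git@codeberg.org:alice/project.git
git push --mirror codeberg

# Cron-able refresh: pull new refs from the source, fan out to the mirror
git fetch --prune origin && git push --mirror codeberg
```

Wrap that in a loop over your repo list and you have the "one button" mirror.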
Aperocky 1 days ago [-]
Why didn't codeberg make this FOSS tool though? Seems natural they should.
r14c 1 days ago [-]
They disabled the mirrors feature because they didn't want a bunch of accounts mirroring large repos and doing nothing else.
Forgejo does support mirrors, just not codeberg.
sibtain1997 11 hours ago [-]
This is really helpful. I always assumed the migration itself would be the hardest part, but the CI point makes a lot of sense.
How has reliability been so far compared to GitHub, especially on bigger repos?
Last commit is two years ago, but still a good overview.
1 days ago [-]
jenadine 16 hours ago [-]
The main reason I use GitHub is because of the network effect.
Most people have already a GitHub account and can easily open PR, issues, discussions.
So I'll wait until some big projects do it.
For example, the day the Rust project moves away from GitHub I'll seriously consider. (Because I use Rust)
tom-blk 10 hours ago [-]
Interesting insight, was going to look into migrating some stuff there anyway, but didn't get around to it yet. Thanks for the push!
packetlost 1 days ago [-]
tangled.org is another interesting take that's open source and built on ATProto (which I have mixed feelings about).
Also radicle.xyz
kps 1 days ago [-]
What I'd like to see is a lazy person's HOWTO for the last paragraph:
> You could tell Codeberg to push new commits to GitHub, but this allows users to still file PRs and comment on issues and commits 2. Some folks have dealt with this by disabling issues on the GitHub repo, but that is a really destructive action as it will 404 all issues, and pull requests cannot be disabled. Some repos like libvirt/libvirt have written a GitHub Action that automatically closes all pull requests.
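As a sketch of the libvirt-style approach (this is not their exact workflow; the action body and comment text here are invented, using the stock `actions/github-script` action):

```yaml
name: Close pull requests
on:
  pull_request_target:
    types: [opened]
permissions:
  pull-requests: write
jobs:
  close:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/github-script@v7
        with:
          script: |
            const pr = context.payload.pull_request.number;
            // Leave a pointer for the contributor, then close the PR
            await github.rest.issues.createComment({
              ...context.repo,
              issue_number: pr,
              body: "This repo is a read-only mirror; please open pull requests on Codeberg.",
            });
            await github.rest.pulls.update({
              ...context.repo,
              pull_number: pr,
              state: "closed",
            });
```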
I have enjoyed using Forgejo over GitHub for local work. The features that GitHub has that plain Git does not include a nice web renderer for markdown and code, issues and pull requests with comments, and project kanban boards. It's nice to have an alternative for local usage if GitHub ever goes down, or just for private projects. Especially nice with agentic workflows, because agents can port issues, PRs, etc. back and forth between GitHub and Forgejo.
askonomm 1 days ago [-]
I'm self-hosting Forgejo on my own home server. It's super easy to do via Docker or as a single binary executable. I even have CI/CD runners on it, which was also very easy to set up. Definitely recommend for those who might not want to rely on someone else, be it Codeberg or not, but still get the same quality as Codeberg (as they literally run Forgejo themselves).
codazoda 1 days ago [-]
I love the simple design of the page. This is a random observation, but I noticed the author has an interesting "likes" button that is served from an API on https://dddddddddzzzz.org, a curious and interesting-looking domain. I'll have to go dig around his blog to see if he's written about this.
KomoD 1 days ago [-]
> This is a random observations, but I noticed the author has an interesting "likes" button that is served from an API on https://dddddddddzzzz.org, a curious and interesting looking domain. I'll have to go dig around his blog to see if he's written about this
I have done this with the help of Claude: Codeberg first, using a static host for my pages and a Woodpecker on Hetzner for a tiny amount of testing. Working well so far.
mtlynch 1 days ago [-]
>The by far nastiest part is CI. GitHub has done an excellent job luring people in with free macOS runners and infinite capacity for public repos.
This was my biggest blocker as well, as there weren't any managed CIs that supported Codeberg until recently.
NixCI[0] recently added support for Codeberg, and I've had a great experience with it. The catch is that you have to write your CI in Nix, though with LLMs, this is actually pretty easy. Most of my CI jobs are just bash scripts with some Nix wiring on top.[1] It also means you can reproduce all your CI jobs locally without changing any code.
I don’t know what everyone else thinks, but for me, GitHub is a shining star among most developer tools and services. Free hosting for code, sites, personal projects, no obligation to make your repositories public or buy a subscription. For most devs, they would never need to buy a subscription. Heck, I don’t need it, but I pay for it as a thank you for what I get (have gotten) out of it.
I am honestly shocked that they managed to survive and thrive after getting bought by MS! So please pardon my ignorance when I say that I don’t understand why some FOSS projects want to move away.
itskokeh 22 hours ago [-]
This is a valid take.
The sheer scale of GitHub and its economic backing by giants allows it to offer a lot, for FREE.
Self-hosted projects cannot compete with this. They either get huge venture funding or remain small.
GitHub is not perfect, but nothing comes close
phyzome 21 hours ago [-]
Thrive? No. GH has been getting very noticeably worse. The uptime is terrible now and the UI is a lot flakier.
sheepscreek 6 hours ago [-]
So their usage has shot up post AI agents/agentic coding. We are seeing the effects via their struggle to keep up with demand. That by itself does not make the service bad. Claude did not even make it to two nines for January (98.9% availability).
Oh, I didn't know GitHub had free macOS CI runners.
Maybe that would solve my dreadful upcoming issue that I'd have to update my Mac to a version with glass to be able to build for the App Store.
throwa356262 1 days ago [-]
Codeberg is not a 1-to-1 replacement for GitHub/GitLab, but for many people it is a better option.
I really wish there was a way to support them with a smaller amount than €24. I don't use Codeberg myself, but I really want to support them.
KomoD 1 days ago [-]
You can, €24 is just for the membership
Wire transfer is €10
Stripe is €5
With PayPal you can send €0.01 if you want
Or Liberapay, as little as €0.01 per week
arcanemachiner 1 days ago [-]
I send them a couple bucks a month via Liberapay. I'm on mobile and short on time, but you can customize the donation amount easily.
pfortuny 1 days ago [-]
I am really really amazed at how many people discount this alternative because it does not work but do not realize that they are being slaves to Microsoft by using Github. Honestly, I do not get it.
johnisgood 1 days ago [-]
Apparently they have issues with self-hosting and basic git usage so I am not surprised, but yes, so many open source advocates, yet they literally depend on Microsoft, a bit too much.
goku12 1 days ago [-]
> but yes, so many open source advocates, yet they literally depend on Microsoft, a bit too much.
I have abandoned github and even gitlab for all intents and purposes. But there's another side to consider in this.
It's always risky for the FOSS community to depend on a service that doesn't offer interoperability and freedom of migration. Ironically, Github is such a service built on a tool (git) that's built for maximum interop and migration. But the popularity of Github among the developer community isn't an accident. They worked really hard during their early stages as a startup, to gain the trust of community. Nobody foresaw Microsoft buying them at that stage (though you should really just assume that it would happen eventually).
The reluctance of a lot of them to abandon the platform can be attributed to lack of principles - IF it was an isolated incident. But we see the same story repeating with several development platforms. NPM is an example. PyPI and crates.io are still independent, as far as I know. But they aren't free of corporate influences either. No matter how much we try to avoid them, the companies just buy their way into these platforms when they become popular enough. I'm not happy with this. But I don't know a solution either.
p2detar 1 days ago [-]
> being slaves to Microsoft
An overly ideological PoV can make it easy to overlook that some people are simply on Github from a practical standpoint. I myself host Forgejo and moved a lot of stuff there. I don't really find a good reason to host anything on Codeberg, yet. Github still offers me a nice set of repos to find via the people I follow there.
d675 1 days ago [-]
What’s the hate on Microsoft?
pfortuny 1 days ago [-]
It's not hate, it is that everybody complains about their services and their predatory behavior but somehow Github gets a free pass. As if it were going to be free forever, and well maintained...
notepad0x90 18 hours ago [-]
My friend has a question, why? Is it a politics thing, or something Microsoft did recently?
itskokeh 22 hours ago [-]
Codeberg hasn't gotten to a level where it can be considered seriously as a GitHub replacement.
jbaber 1 days ago [-]
Meet developers where they are: have multiple remotes at multiple forges and mailing lists.
Don't keep all your eggs in one basket.
elzbardico 1 days ago [-]
Really, the day I finally tire of GitHub, I will just move to GitLab. Git hosting is not something I want to waste my time yak-shaving on.
huijzer 1 days ago [-]
I've been running Forgejo for years now and I spend almost no time on it. I just host it with my other services. Backups happen automatically with Syncthing, and I manually check in on the server and run apt-get upgrade once every two weeks.
lijunle 1 days ago [-]
The concerning part is commercial. That is why Cloudflare Pages/workers is a better option than GitHub pages or Vercel Pages.
steveharing1 1 days ago [-]
Having options is really important, because relying heavily on one thing is not something that always goes well.
gitprolinux 1 days ago [-]
I just have to say that I wrote my own git hosting service and eat my own dog food at gitbusiness.com
Arcuru 1 days ago [-]
Does any service offer hosted Forgejo Actions Runners? Or Forgejo compatible CI?
I want to pay for CI on my Codeberg projects, but I've been struggling to find something where I can just pay by the minute. I have projects that benefit from large CI runners but my usage is low enough that it makes no sense to host my own.
I_am_tiberius 1 days ago [-]
I wish they had a paid plan for private repositories that aren't FOSS.
stock_toaster 1 days ago [-]
I recently ran across codefloe[1] in another thread[2], and have been considering it for private non-FLOSS-related repos... haven't tried it out yet though, so mileage may vary.
I've been using codefloe for everything personal, and I couldn't be happier. It's been stable, snappy, and offers me everything I need. I don't miss GitHub a single bit
stock_toaster 1 days ago [-]
Thanks for the positive anecdote. That gives me more motivation to finally try it out.
unwoven 1 days ago [-]
> The by far nastiest part is CI. GitHub has done an excellent job luring people in with free macOS runners and infinite capacity for public repos
Yup and this is where I pass on anything other than GitHub.
maccard 1 days ago [-]
Agreed - this is also where Github is the most unreliable. our _number one_ reason for build failures is "GHA being down/degraded" in 2026.
jamiemallers 1 days ago [-]
[dead]
conradev 1 days ago [-]
GitHub is free, but the runners are slow and increasingly unreliable.
I use Namespace (https://namespace.so) and I hook it up both to my personal GitHub as well as my personal Forgejo. I’m in the process of moving from the former to the latter!
ekropotin 1 days ago [-]
I didn't really realize the degree of their slowness until I migrated one of the projects to a self-hosted Gitea and runners. This setup is just breezing! We're talking an order of magnitude faster.
Granted, self-hosting git is not feasible for everyone, but GitHub + self hosted runners seems like a very good option.
thinkxl 1 days ago [-]
I've had a good experience with Woodpecker CI. I've heard that installation and integration with ForgeJo isn't easy, but I deploy everything to my homelab using Dokku, where I push a Dockerfile, mount a volume (on setup), and it's good to go.
I assume this isn't optimal for a business setup, but for personal projects, I don't miss GitHub Actions at all.
dangus 1 days ago [-]
I was going to say that I’d be happy to run a local Mac mini to be a runner but I noticed that Forgejo runners are only built for Linux.
It seems like to be a serious CI platform they really need to ship Windows and Mac binaries for runners so you can build for those platforms.
And this is more of a Forgejo issue than a Codeberg issue specifically.
But also, I’d also throw out there the idea that CI doesn’t have to be at the same website as your source control. It’s nice that GitHub actions are conveniently part of the product but it’s not even really the top CI system out there.
mfenniak 1 days ago [-]
Forgejo is committed to using exclusively Free Software for its own project development. Windows and Mac versions of the Forgejo Runner are built in the project's CI system as a minimal check to ensure platform compatibility, but due to the project's commitment, the project doesn't do integration testing on these platforms, and therefore doesn't distribute untested software.
I get it for open source projects but at least use something nice like depot.dev for commercial ventures.
hrmtst93837 1 days ago [-]
[flagged]
Jotalea 1 days ago [-]
even better, selfhost your own gitea instance
1 days ago [-]
vaylian 1 days ago [-]
How do other people open pull/merge requests for your projects?
swiftcoder 1 days ago [-]
Honest question: do you want them to? Most of us aren't running high-profile OSS projects, and drive-by PRs are a pretty widespread complaint about GitHub's model of opensource
rmi_ 1 days ago [-]
Just push to that instance, or, as Linus intended, send patches via e-mail.
KomoD 1 days ago [-]
They make an account or you give them one?
fareesh 1 days ago [-]
never had a problem with github, i must be using it during the 90% of the time that it works
jedisct1 1 days ago [-]
Codeberg is great, but I really miss Octobox.
I can't imagine using GitHub without Octobox; it's just impossible to keep track of all the notifications by email.
Unfortunately, Octobox doesn't support Codeberg, so I've no idea how to follow projects there, even the ones I really want to contribute to.
jtfrench 1 days ago [-]
Haven’t heard of Codeberg. What are the top reasons to switch from GitHub?
holoduke 24 hours ago [-]
Seriously thinking of moving to my own private Gitea. I mean, in less than an hour I can have it set up. Making some daily backups to a cheap bucket somewhere is enough. In addition I get unlimited git action time; that's currently my biggest cost in my private workflow.
dalvrosa 1 days ago [-]
Codeberg vs selfhosted Gitlab. What do you think?
real_joschi 1 days ago [-]
I think the question is rather gitlab.com vs. self-hosted GitLab and Codeberg vs. self-hosted Forgejo.
dalvrosa 13 hours ago [-]
Fair
jayd16 1 days ago [-]
For what it's worth, it's pretty easy to maintain a low-traffic GitLab instance.
sergiotapia 1 days ago [-]
The problem is paas platforms and other platforms don't really support codeberg or any other alternatives.
Can I link a codeberg repo to Railway for example?
sylware 1 days ago [-]
codeberg.org still requires "javascript", aka one of the massive WHATWG-cartel C++-written web engines. Prefer sourcehut or others which are not web apps, in other words are web sites (classic web, or noscript/basic HTML for critical core functions at least, like issue tracking).
microsoft carefully broke classic web support overtime, THX AGAIN MICROSOFT, WE LOVE YOU!
rvz 1 days ago [-]
This was kind of predictable [0] and even self-hosting your own solution was done way before GitHub existed and now has better uptime than them.
Now they are turning GitHub into a canteen for AI agents and their AI chatbots (Copilot, Tay.ai and Zoe) to feed them on your code if you don't opt out.
> The by far nastiest part is CI. GitHub has done an excellent job luring people in with free macOS runners and infinite capacity for public repos
Hosting was never free and if you do not want Codeberg to go the way of GitHub, you need to pay for it.
Otherwise expect GitHub downtime to hit every week or so.
But that's the most important part. A repository without CI is basically dead.
The biggest challenge of this era is automated verification, and proper CI infrastructure is essential.
GitHub feels like what Hudson/Jenkins was some decades ago. Horrible, but the only one that did what it did.
I run probably hundreds of dollars of CI on GitHub per month. Except I don't pay a cent for it (all open source public repos). I can't just let that go, those workers do real work.
bigstrat2003 17 hours ago [-]
You do not in any way need CI to make software. It's a helpful tool, not a requirement.
gaigalas 17 hours ago [-]
Don't put words in my mouth.
jjslocum3 1 days ago [-]
I'm still more comfortable keeping my code in America.
maxdo 1 days ago [-]
> Everything runs on servers that we control. We will not sell your data.
> Hosted in Europe, we welcome the world.
So it's "you control it and make money" vs. "they control it and make money". What is the difference here, except some EU version of the MAGA movement?
aerzen 1 days ago [-]
Forgejo is also OSS, so you can easily migrate git and issues and prs and everything to your own hosted instance, if codeberg proves untrustworthy. Which I can't on github.
moritzruth 1 days ago [-]
> we welcome the world
> maga movement
paulddraper 1 days ago [-]
No service provider lock-in.
Codeberg is just a hosted instance of Forgejo (GPLv3).
They even support a workflow for migrating to a different Forgejo instance [1].
They also don't want to host your homepage, so if GitHub Pages is why you used GitHub, they are not a replacement.
Unfortunately I don't think there's really an answer to that conundrum that doesn't involve just spinning up your own git server and accepting all the operational overhead that comes with it. At least Forgejo (software behind Codeberg) is FOSS, so you can do that and it should cover most of what you need (and while you're in the realm of having a server, a Pages-esque replacement is trivial since you're configuring a webserver anyway.) Maybe Gitlab.com, although I am admittedly unfamiliar with how Gitlab's "main" instance has changed over the years wrt features.
Here's their FAQ on the matter, it's worth a read: https://docs.codeberg.org/getting-started/faq/
https://docs.codeberg.org/codeberg-pages/
> If you do not contribute to free/libre software (or if it is limited to your personal homepage), and we feel like you only abuse Codeberg for storing your commercial projects or media backups, we might get unhappy about that.
Emphasis mine. This isn't about if it's technically possible (it certainly is), it's whether or not it's allowed by their platform policies.
Their page publishing feature seems more like it's meant for projects and organizations rather than individual people. The way it's described here indicates that using them to host your own blog/portfolio/what have you is considered to be abusing their services.
All it shows the world is why there needs to be a VAT-like tax against US digital services to help drive a public option for developers.
There's no reason why the people can't make our own solutions rather than be confined to abusive private US tech platforms.
A better alternative would be to create the incentives so that companies like these can be born in Europe.
You are against democracy, I am not. Democracy has led to some of the best advances of civilization, all oligarchies have done is introduce mass poverty, mass misery, and mass death.
At least with democracy we went to the moon for mankind, not shareholders.
My 2 cents.
Keep in mind, I'm not saying Codeberg is bad, but its terms of use are pretty clear in the sense that they only really want FOSS, and anyone who has something other than FOSS had better look elsewhere. GitHub allowed you to basically put up anything that's "yours" and the license wasn't really their concern - that isn't the case with Codeberg. It's not about price either; it'd be fine if the offer was "either give us $5 for the privilege of private repositories or only publish and contribute public FOSS code" - I'm fine paying cash for that if need be.
One of the big draws of GitHub (and what got me to properly learn git) back in the day with GitHub Pages in particular was "I can write an HTML page, do a git push and anyone can see it". Then you throw on top an SSG (GitHub had out of the box support for Jekyll, but back then you could rig Travis CI up for other page generators if you knew what you were doing), and with a bit of technical knowledge, anyone could host a blog without the full on server stack. Codeberg cannot provide that sort of experience with their current terms of service.
Even sourcehut has, from what I can tell, a more lenient approach to what they provide (and the only reason why I wouldn't recommend sourcehut as a GitHub replacement is because git-by-email isn't really workable for most people anymore). They encourage FOSS licensing, but from what I can tell don't force it in their platform policies. (The only thing they openly ban is cryptocurrency related projects, which seems fair because cryptocurrency is pretty much always associated with platform abuse.)
I mean, it is arguably much easier to just write the HTML page and upload it with FTP and everyone can see it. I never understood why github became a popular place to host your site in the first place.
Easy: it was free, it was accessible to people that couldn't spend money for a hosting provider (read: high schoolers) and didn't impose arbitrary restrictions on what you were hosting.
Back then, your options as a high school student were basically to either try and reskin a closed-off platform as much as you could (Tumblr could do that, but GitHub Pages also launched in the period when platforms were cracking down on any user customization larger than "what is my avatar"), or to accept that the site you published your stuff on could disappear the moment the sketchy hosting provider that gave you a small amount of storage decided your bandwidth costs justified upselling you to the premium plan.
GitHub didn't impose those restrictions, in exchange for being a bit less interactive when it came to publishing things (so no such thing as a comment section without using Disqus or something like that, and chances are you didn't need the comments anyway, so win-win). That's why it got a lot more popular than just using an FTP server.
1) Passive mode. What is it and why do I need it? Well, you see, back in the old days, .... It took way too long for this critical "option" to become well supported and used by default.
2) Text mode. No, I don't want you to corrupt some of my files based on half-baked heuristics about what is and isn't a text file, and it doesn't make any sense to rewrite line endings anymore anyway.
3) Transport security. FTPS should have become the standard decades ago, but it still isn't to this day. If you want to actually transfer files using an FTP-like interface today, you use SFTP, which is a totally different protocol built on SSH.
And if you already knew how to write/make HTML you'd for sure already know all of that too.
I don't always have a server available to host an HTTP+FTP server on. Or want to pay for one, or spend time setting one up. I can trust that Github Pages will have reasonable uptime, and I won't have to monitor it at all.
> And if you already knew how to write/make HTML you'd for sure already know all of that too.
This seems unnecessarily aggressive, and I don't really understand where it's coming from.
BTW, you can absolutely host plain HTML with Github Pages. No SSG required.
That's a completely false statement. My kid took very basic programming classes in school which covered HTML so they could build webpages, which is a fantastic instant-results method. Hooray, now the class is finished, he wants to put it on the web. Just like millions of other kids who couldn't even spell FTP.
No it wasn't. Seriously, where?
a) git pull & push for updates
b) good enough CDN distribution, sometimes interactive examples/project page loads tons of files
c) good enough security promises of the entire platform/infra
d) good enough serviceable time, we do not need 99.9999SLA but better not down often
e) have generous free tier
f) great DX & UX, this one is small but small headache adds up quickly
Sounds like they're cool with a little personal website.
Hey, I’m building Monohub - as a GitHub alternative, and having private repositories is perhaps a key feature - it started as a place for me to host my own random stuff. Monohub [dot] dev is the URL. It’s quite early in development, so it’s quite rough around the edges. It has PR support though.
Hosted in EU, incorporated in EU.
Would be happy if you tried it out — maybe it’s something for you.
Edit: you can have a look at a public repository I have to see what it looks like now: https://monohub.dev/@tbayramov/efcore-audit-timestamps
Domain whois: https://monohub.dev.whoswho
(Yes, I'm still salty google broke our localhost .dev sites.)
(Yes, I know it was never a reserved TLD.)
Hmm all that operational overhead... Of an ssh server? If you literally just want a place to push some code, then that really isn't that hard.
And no, it's not that hard once you learn. Except now it's a never-ending chore when it was an appliance. Instead of a car, you have a project car.
Can confirm.
Also, not everyone who wants to share content publicly has a domain name with which to do so, or the kind of Internet connection that allows running a server. If you include "hosting" by using a hosting provider... it's perfectly possible (raises hand) to not even have any experience with that after decades of writing code and being on the Internet. (Unless you count things like, well, GitHub and its services, anyway.)
And I believe GP was talking about how the only thing you need is an SSH server with git installed. Then you can add the remote to your local repo with `git remote add origin <ssh-url>`, and all you need to do after that is `git push`. Replace `origin` with another identifier if it's already taken, then do the above.
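Spelled out as a runnable local sketch (a filesystem path stands in for the `user@host:path` URL you'd point at a real SSH server):

```shell
set -e
# "Server" side: a bare repository to push to; over SSH this would be
#   ssh user@host 'git init --bare ~/repos/project.git'
git init -q --bare /tmp/sshdemo-srv.git
# Local side: create a repo, point it at the "server", and push
git init -q -b main /tmp/sshdemo && cd /tmp/sshdemo
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m init
git remote add origin /tmp/sshdemo-srv.git   # or: user@host:repos/project.git
git push -q origin main
```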
then troubleshoot.
set vm backup policy.
save myriad passwords and secret to bitwarden.
get ubuntu to stay up to date.
Just store the repo and access it with your account.
The whole git@ thing is because most "forge" software is built around a single dedicated user doing everything, rather than taking advantage of the OS users, permissions and acl system.
For a single user it's pointless. For anyone who knows how to setup filesystem permissions it's not necessary.
> but if you're just looking for a generic place to put your code projects that aren't necessarily intended for public release and support (ie. random automation scripts, scraps of concepts that never really got off the ground, things not super cleaned up), they're not really for that - private repositories are discouraged according to their FAQ and are very limited (up to 100mb).
You don't need a one size fits all solution...
Until the AI scrapers[1] come for you at 5k requests per second and you're doing operations in hard-mode.
1. Most forges have http pages for discoverability. I suppose one could hypothetically setup an ssh-only forge and statically generate a html site periodically, but this is already advanced ops for the average Github user
1: https://code.storage/
Also gitlab has cves like every other week... You're going to be on that upgrade train, unless you keep access locked down (no internet access!!) and accept the admittedly lower risk of a vulnerable gitlab server on your LAN/VPN network.
Even if gitlab is updated fully, you're fighting bot crawlers 24/7.
Step 1.) Stay on GitHub
I think the internet has "GitHub Derangement Syndrome" right now. It's an outlet for people's frustration.
The current trend reminds me a lot of the couple years we had where Game Developers were that outlet. They needed to "Wake up" and not "Go woke, go broke". An incredible amount of online discourse around gaming was hijacked by toxic negativity.
I'm sure every individual has their really good logical reasons, but zooming out I think there is definitely a similar social pathology at play.
I would argue that the open source people aren't the only ones paying attention right now.
If you are hosting proprietary code on Github, it has become clear that Microsoft is going to feed that into their AI training set. If you don't want that, you don't have a choice but to leave Github.
If I just want to host my code, I can self host or use an SSH/SFTP server as a git remote, and that's usually what I do.
And so we go, forever in circles, until enough of us move to other platforms regardless of where the existing community is. Just like how GitHub found its community in the early days, when most people (afaik) were using SourceForge, if anything.
"The community" will always remain on GitHub, if everyone just upload code to where "the community" already is. If enough of us stop using GitHub by default, and instead use something else, eventually "the community" will be there too, but it is somewhat of a chicken-and-egg problem, I admit.
I myself work around this by dropping the whole idea that I'm writing software for others; I only write it for myself, so if people want it, they can go to my personal Gitea instance and grab it. I couldn't care less about stars and "publicity" or whatever people nowadays care about. But I'm also lucky enough to already have a network; it might require others to build their network on GitHub first, then do something similar, and it'll all work out in the end.
SourceForge was abandoned due to UX issues and the adware debacle; at the same time, GitHub started making changes which made it more viable to use the platform to distribute binary releases.
The deficiencies of GitHub are not critical enough for me to care, and if it ever gets that bad, pushing somewhere else and putting a few "WE HAVE MOVED" links isn't a big deal.
And "the community" isn't moving to Codeberg because Codeberg can't support "the community" without a massive scale up.
I'd say SourceForge was abandoned due to VA Linux going under. I remember the pain/panic as large numbers of OSS projects were suddenly scrambling to find alternatives to SF. I actually started a subscription to GitHub just to try and help ensure they had the money to stay in business, and we didn't have to go thru that again.
People have a superficial knowledge of the space (I think this extends beyond Codeberg) but feel strongly that they need to advocate for something. Codeberg themselves seem to have opinions about what they want to do but people are suggesting they can do more simply because it gives them an outlet.
The constraints that Codeberg set seem to, on the surface at least, ensure they can scale based on their needs and protect them from external threats. Hosting random sites comes with a range of liabilities they probably understand and want to avoid right now. There are EU regulations which can be challenging to handle.
The part of the FOSS community that embraces proprietary dependencies are there, but there’s a lot of the community outside of it.
Fortunately, GitHub is pushing hard for folks to want to move away.
It's worth $50 just this month, according to them, but I don't see anyone else offering the mac runners that account for most of it.
For all the complaints, I test my packages that actually need it across dozens of architecture and OS combinations with a mix of runners, nested virtualization and qemu binfmt, all on their free platform.
It's a shame. The people who control the money successfully committed enshittification against open source.
Mine syncs a few repos to GitHub to make them public, and I'm planning to add Codeberg as well, a la POSSE [2].
1. https://forgejo.org/docs/next/user/repo-mirror/
2. https://indieweb.org/POSSE
Of course, that mostly goes for projects big enough to already have an indepedent community.
I'm not trying to defend github here. The largest platform could have been anyone who took advantage of the early opportunities in the space, which just happens to be Github. But discoverability is still a nagging problem. I don't think that even a federated system (using activitypub, atproto or whatever else out there) is going to solve that problem. We need a solution that can scour the entire code hosting space like search engines do (but collaboratively, not aggressively like LLM scrapers).
It's helpful to have a github mirror of your "real" repo (or even just a stub pointing to the real repo if you object to github strongly enough that mirroring there is objectionable to you).
One day maybe there will be an aggregator that indexes repos hosted anywhere. But in many ways that will be back to the square one - a single point of failure.
The Fediverse seems to dislike global search. Or is that just a mastodon thing?
Lemmy seems to do a decent enough job at global searches. It's most likely just a mastodon problem.
Word of mouth. Package managers. Search engines. Your LLM of choice.
Does anyone seriously use GitHub search to discover new projects?
Package managers: Same problem as above. You missed the point of free software.
Search engines: They do a disastrous job of indexing anything on a forge. You might as well yell at the clouds instead.
LLM of choice: I'm not taking this seriously.
> Does anyone seriously use GitHub search to discover new projects?
I don't even understand the point of such questions. None of the solutions you proposed are any better solving what I described than the insufficient method I wrote about.
[1]: https://huijzer.xyz/posts/55/installing-forgejo-with-a-separ...
[2]: https://huijzer.xyz/posts/55/installing-forgejo-with-a-separ...
I was self hosting gitlab for a long time. But forgejo is an order of magnitude less resource intensive.
It is a single very small go binary. You can use sqlite or postgres. But you can easily run it inside a small docker container on your local machine.
And it is fun to hack on it because it is so open. You build really fun workflows that are blocked by the corporate limits of Github.
Build Nix config into a VM image => Deploy VM to Proxmox via its API => Spin up Docker stack via Komodo
I've also been trying to use it to sync my Obsidian vault via git to my phone, altho that flaked out on me recently (if anyone knows a reliable way to use git via the shell on iOS, please let me know).
Also now wondering about differences with forgejo.
This on its own makes me pretty bearish on community-driven attempts to oust GitHub, even if ideologically I'm aligned with them: the real cost (both financial and in terms of complexity) of user expectations around source forges in 2026 is immense.
I think the real problem is we were sold all these complex processes that supposedly deliver better results, while in reality for most people and orgs it's just cargo culting, like with Kubernetes, for example. We can get rid of 90% of them and be just fine. You can easily get away without any kind of CI in teams of fewer than 5-7 people, I would argue - just have some sane rules and make everyone follow them (like running unit tests before submitting a PR).
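That kind of sane rule can even be enforced locally with a plain git hook; a minimal sketch, where `make test` is a placeholder for whatever your test runner is:

```shell
set -e
git init -q -b main /tmp/hookdemo
# Install a pre-push hook that aborts the push whenever the test suite fails
cat > /tmp/hookdemo/.git/hooks/pre-push <<'EOF'
#!/bin/sh
make test || { echo "tests failed; push aborted" >&2; exit 1; }
EOF
chmod +x /tmp/hookdemo/.git/hooks/pre-push
```

Hooks aren't versioned, so each teammate installs it once (or you point `core.hooksPath` at a directory tracked in the repo).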
> what is CI (in practice today, not in theory)? It's just a set of commands that run on a remote machine and then the output of those commands is displayed in the browser and it also influences what other commands may or may not run. What exactly is the big deal here?
The key is hermetically/reproducibly - you don't want to run commands on some random machine, you want to run commands on a well-controlled/versioned machine that won't change under you, that you can recreate at will. Which sure you should be able to do with Nix or something, but the kind of person who doesn't want to use CI doesn't want to use Nix either.
And the other key is whose machine? Particularly when it comes to e.g. Macs, which are expensive. Maybe you have a spare server farm in your closet, but most people don't.
Is the compiling and test running too resource intensive?
Do you build every commit? If so, why?
I see the value in larger teams, but for solo stuff I just find it slow and annoying. I'm using go, and it compiles fast, so that could be a part of it.
I end up with all kinds of random crap on my own machine. It's very easy to accidentally e.g. globally install a library that wasn't properly listed in my dependency management. So having a separate standardised/controlled build environment is a good way to catch those. It also helps with flaky tests or random "works on my machine" problems - this way my tests are at least getting run on two quite different machines (different OS/arch/etc.)
Why not use VMs? Libvirt is scriptable enough for that. And LXC/Incus can be used if you want the shorter starting time.
None of this is hard, exactly, but you do have to put in the legwork of doing it, and it's mostly only the big players who've done so.
Creating a build machine is not rocket science.
Which is actually useful.
and thus you discover the value of CI
Good luck implementing merge queues yourself. As far as I know there are no maintained open source implementations of merge queues. It's definitely not as trivial as you claim.
Why? I know plenty of teams which are fine with repo and CI being separate tools as long as there is integration between the 2.
The whole PR and code review experience is much more important to me. Github is striving to set a high bar, but is also hilariously bad in some ways. Similarly the whole issue system is passable on Github, but doesn't really reach the state of the art of issue systems from 20 years ago
GitHub’s hosted runners support a grand total of two architectures.
The only forges I’ve seen with more variety are distributions’ forges, usually hosting their own runners.
For me it's providing uptime. Github is barely reaching one nine of availability these days.
On the other hand Codeberg doesn't let you create private repositories at all. So Copilot could still legally scrape your open source Codeberg repos.
I don't see much of a point for most people. https://docs.codeberg.org/getting-started/faq/
> If you need private repositories for commercial projects (e.g. because you represent a company or are a developer that needs a space to host private freelance projects for your clients), we would highly recommend that you take a look at Forgejo. Forgejo is the Git hosting software that Codeberg runs. It is free software and relatively easy to self-host. Codeberg does not offer private hosting services.
are you sure about that? I'm fairly certain my repos on codeberg are all private but I could be mistaken.
I think Forgejo would work fine for smaller projects and teams. We really wanted to stop having to worry about GitHub going down and not being able to do CD, as well as get away from a lot of the Actions zero-days happening.
And yes, it's self-hosted and free! You can run a reference implementation pretty easily with non-production components (i.e. they won't backup or scale well).
Most of my friends who use codeberg are staunch cloudflare-opponents, but cloudflare is what keeps Gitlab alive. Fact of life is that they're being attacked non-stop, and need some sort of DDoS filter.
Codeberg has that anubis thing now I guess? But they still have downtime, and the worst thing ever for me as a developer is having the urge to code and not being able to access my remote. That is what murders the impression of a product like codeberg.
Sorry, just being frank. I want all competitors to large monopolies to succeed, but I also want to be able to do my job/passion.
> the worst thing ever for me as a developer is having the urge to code and not being able to access my remote.
Makes it seem like GitHub/Codeberg has to be online for you to be able to code, is that really the case? If so, how does that happen, you only edit code directly in the GitHub web UI or how does one end up in that situation?
I'm guessing you have SSH access between the two? You could just add it as another remote, via SSH, so you can push/pull directly between the two. This is what I do on my home network to sync configs and other things between various machines and OSes, just do `git remote add other-host git+ssh://user@10.55/~/the-repo-path` or whatever, and you can use it as any remote :)
Bonus tip: you can use local paths as git remote URLs too!
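That bonus tip in a runnable form (paths are hypothetical; any mounted disk or network share works the same way):

```shell
set -e
# A bare repo on, say, a mounted USB stick or NFS share acts as the "remote"
git init -q --bare /tmp/usb-backup.git
git init -q -b main /tmp/usb-project && cd /tmp/usb-project
git -c user.email=a@b -c user.name=a commit -q --allow-empty -m init
git remote add usb /tmp/usb-backup.git   # a plain filesystem path as the remote URL
git push -q usb main
```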
> but more than once I've lost work that way.
Huh, how? If you didn't push it earlier, you could just push it later? Same goes for pull? I don't understand how you could lose anything tracked in git; corruption, or what happened?
That's what I do. Control your entire world yourself.
Sure, you might neglect to add a file to your commit, or commit at all, but that's a problem whether you're pushing to a central public git forge or not.
`ssh remote "cd $src/repo ; git diff" | git apply`
(You'll need to season to taste: what to do with staged changes, how to make sure both trees are in the same HEAD, etc)
For how infrequent I interface with Codeberg I have to say that my experience has been pretty terrible when it comes to availability.
So I guess the answer is: the availability is bad enough that even infrequent interactions with it are a problem.
I can understand that work with other active contributors, but I agree with you that it is a daft state of affairs for a solo or mostly-solo project.
Though if you have your repo online even away from the big places, it will get hit by the scrapers and you will end up with admin to do because of that, even if it doesn't block your normal workflow because your main remote is not public.
Not really. The point of git was to make Linus' job of collating, reviewing, and merging work from a disparate team of teams much less arduous. It just happens that many of the patterns needed for that also mean making temporarily disconnected remote repositories work well.
--------
[1] He did a lot more than type “help” - he was essentially trying to reverse engineer the product to produce a compatible but more open client that gave access to metadata BitKeeper wanted you to pay to be able to access [2], which was a problem for many contributors.
[2] You didn't get the fullest version history on the free variants; this was one of the significant concerns making people discuss alternatives, and in some high-profile cases just plain refuse to touch BitKeeper at all.
So what? That's not how most people prefer to use it.
They're not. They're using it very correctly, by choosing to ignore a capability that's irrelevant to them.
> If everyone was running around using screwdriver handles to pound in nails
Stop thinking by analogy. It harms your ability to think correctly.
Even with the best habits, there will be the few times a month where you forgot to push everything up and you’re blocked from work.
Codeberg needs to meet the highest availability levels for it to be viable.
Philosophically I think it's terrible that Cloudflare has become a middleman in a huge and important swath of the internet. As a user, it largely makes my life much worse. It limits my browser, my ability to protect myself via VPNs, etc., and I am just browsing normally, not attacking anything. Pragmatically though, as a webmaster/admin/whatever you want to call it nowadays, Cloudflare is basically a necessity. I've started putting things behind it because if I don't, 99%+ of my traffic is bots, and often bots clearly scanning for vulnerabilities (I run mostly zero-PHP sites, yet my traffic logs are often filled with requests like /admin.php and /wp-admin.php and all the WordPress things, and constant crawls from clearly-not-search-engines that download everything and use robots.txt as a guide of what to crawl rather than what not to crawl). I haven't been DDoSed yet, but I've had images and PDFs and things downloaded so many times by these things that it costs me money. For some things where I or my family are the only legitimate users, I can just firewall-cmd all IPs except my own, but even then it's maintenance work I don't want to have to do.
I've tried many of the alternatives, and they often fail even on legitimate usecases. I've been blocked more by the alternatives than I have by Cloudflare, especially that one that does a proof of work. It works about 80% of the time, but that 20% is really, really annoying to the point that when I see that scren pop up I just browse away.
It's really a disheartening state we find ourselves in. I don't think my principles/values have been tested more in the real world than the last few years.
I have a website small enough not to attract too many bots, but sometimes, something very innocent can bring my website down.
For example, I put up a PHP iCal viewer, and some crawler started loading the calendar page, taking up all the CPU cycles.
Mental note, make sure my robots.txt files contain a few references to slowly returning pages full of almost nonsense that link back to each other endlessly…
Not complete nonsense, that would be reasonably easy to detect and ignore. Perhaps repeats of your other content with every 5th word swapped with a random one from elsewhere in the content, every 4th word randomly misspelt, every seventh word reversed, every seventh sentence reversed, add a random sprinkling of famous names (Sir John Major, Arc de Triomphe, Sarah Jane Smith, Viltvodle VI) that make little sense in context, etc. Not enough change that automatic crap detection sees it as an obvious trap, but more than enough that ingesting data from your site into any model has enough detrimental effect to token weightings to at least undo any beneficial effect it might have had otherwise.
And when setting traps like this, make sure the response is slow enough that it won't use much bandwidth, and the serving process is very lightweight, and just in case that isn't enough make sure it aborts and errors out if any load metric goes above a given level.
A bit like this? (iocaine is newer)
Is iocaine actually newer though? Its first commit dates to 2025-01, while the blog post is from 2025-03. I couldn't find info on when Cloudflare started theirs. There's also Nepenthes, which had its first release in 2025-01 too.
I don't buy that for a second. Those not obeying robots.txt were doing so either because they were malicious (they wanted everything and wouldn't be told “please don't plough through these bits”) or stupid (not knowing any better) or both.
Anyone who was obeying robots.txt isn't going to start ignoring it because we've put honeypots there. Why would they think “well, now there are honeypots there I'm going to go scan those… honypots, yeah, that's a good idea”.
> The other reason is that bandwidth/bots are cheap enough now that they don't need web admins to help them optimize their crawlers
Web admins are not trying to help them optimize their crawlers; they are trying to stop those crawlers from breaking their sites.
[1] https://mrshu.github.io/github-statuses/
And pretty much all of them: ByteDance, OpenAI, AWS, Claude, various ones I couldn't recognize. I basically just had to block all of them to get reasonable performance for a server running on a mini-PC.
I was going to move to Codeberg at some point, but they had downtime when I was considering it; I'd rather deal with that myself, then.
They generate a URL for every version of every file on every commit and every branch and tag, and if that wasn't enough, n(n+1)/2 git diffs for every file on every commit it has existed on. Even a relatively small git repo with a few hundred files and commits explodes into millions of URLs in the crawl frontier. Server-side, many of these are very expensive to generate as well, so it's really not a fantastic interaction between crawler and git host.
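Back-of-envelope, the n(n+1)/2 term dominates quickly; a sketch with made-up but modest numbers:

```python
def crawl_frontier_urls(files=300, commits=300, branches=5, tags=20):
    """Rough count of distinct URLs a naive crawler can discover on a
    git forge: one file view per file per commit/branch/tag, plus
    n(n+1)/2 pairwise diffs per file across n commits."""
    file_views = files * (commits + branches + tags)
    diffs = files * commits * (commits + 1) // 2   # n(n+1)/2 per file
    return file_views + diffs
```

With the defaults above (a 300-file, 300-commit repo), the diff term alone contributes over 13 million URLs.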
If you run a web crawler, you need to add git host detection to actively avoid walking into them.
From the shape of the traffic it just looks like a poorly implemented web crawler. By default, a crawler that does not take measures to actively avoid git hosts will get stuck there and spend days trying to exhaust the links of even a single repo.
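A crude version of that detection (the URL patterns here are hypothetical, not taken from any real crawler) is to prune forge-style per-commit URLs from the frontier before fetching:

```python
import re

# Path patterns that explode combinatorially on git forges:
# per-commit file views, blame, raw files, and pairwise diffs.
GIT_FORGE_TRAPS = re.compile(
    r"/(commit|commits|blob|raw|blame|diff|compare)/[0-9a-f]{7,40}\b",
    re.IGNORECASE,
)

def should_crawl(url: str) -> bool:
    """Skip URLs that look like per-commit forge views."""
    return GIT_FORGE_TRAPS.search(url) is None
```

A real crawler would pair this with per-host budgets, so that even an unrecognized forge can't absorb days of crawl time.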
I do think they care about repos, and not just the code but also how it evolves over time; I can see some marginal use in those traits. But if they really wanted that, I'd rather they clone my repos; I'd be totally fine with that. But I guess they'd have to deal with state, and they likely don't want to. Rather just increase my energy bill ;)
The idea that you shouldn't need a code hosting platform because git is decentralized is so out of place that it is genuinely puzzling how often it pops up.
The parent post mentioned: "the worst thing ever for me as a developer is having the urge to code and not being able to access my remote."
Emphasis on "code", not triaging issues, merging other people's branches, etc.
Besides there are tools to sync forgejo repositories including PRs and issues.
They said they want to be able to rely on their git remote.
The people responding are saying "nah, an unreliable remote is fine because you can use other remotes" which doesn't address their problem. If Codeberg is unreliable, then why use it at all? Especially for CI, issues, and collab?
Maybe a hard blocker if you are pair programming or collaborating every minute. Not really if you just have one hour to program solo.
Only exception is when people are using the code editor embedded in the "forge" but this is usually an exceptional use rather than the norm.
It doesn't remove it but doesn't make it very relevant either, because of all the tests that are necessarily done remotely and can't be done locally, and without that feedback in many cases development is not possible.
Well, Codeberg doesn't have all the features I did use of Gitlab, but for my own projects I don't really need them either.
Thank God GitHub is... oh.
https://mrshu.github.io/github-statuses/
I think that's the moment when you choose to self host your whatever git wrapper. It really isn't that complicated to do and even allows for some fun (as in cheap and productive) setups where your forge is on your local network or really close to your region and you (maybe) only mirror or backup to a bigger system like Codeberg/GitHub.
In our case, we also use that as an opportunity to mirror OCI/package repositories for dependencies we use in our apps and during development so not only builds are faster but also we don't abuse free web endpoints with our CI/CD requests.
Been working on it for months now, it does work, lol.
I've also been very happy with sourcehut for most of my personal projects for some time. The email patch submission workflow is a tad bit unfamiliar for most, but IMO in today's era raising that barrier to entry is mostly a good thing for OSS projects.
I also strongly prefer a simple CI environment (where you just run commands), which encourages you to actually be able to run your CI commands locally.
The underlying protocol (git) already has the cryptographic primitives that decouples trust in the commit tree (GPG or SSH signing) with trust in the storage service (i.e. github/codeberg/whatever).
All you need to house centrally is some SSH and/or gpg key server and some means of managing namespaces which would benefit from federation as well.
You'd get the benefits of de-centralisation - no over-reliance on actors like MS or cloudflare. I suppose if enough people fan out to gitlab, bitbucket, self hosting, codeberg, you end up with something that organically approximates a formally decentralised git repo system.
Hence Tangled and ForgeFed (which I believe is being integrated into Forgejo).
I think at this point the bigger barrier to me with leaving GitHub (professionally, at least) is all the non-GitHub stuff that integrates nicely with it and badly or not at all with other solutions. And like, I don't blame tool providers for making a rational economic choice in that regard, but if leaving GitHub means leaving seamless Sentry, Depot, Linear, editor plugins, AI integrations, etc that makes it a tougher pill to swallow.
I worked for years at a shop that had in-house GitLab and we felt this pain first hand all the time. GitLab tries to be a one-stop shop and own the whole project management and testing/deployment workflow by building everything in house, but there were always gaps and it was hard not to be jealous of places that just did everything on GitHub and could use whatever best in class saas stuff they wanted.
Gitlab has been tracking a federation feature since at least 2018 [1], and I expect bitbucket, sourcehut, gitea, and others would move quickly on something like this as well, but there needs to be a protocol defined and some kind of plan for handling spam/abuse.
[1]: https://gitlab.com/groups/gitlab-org/-/work_items/16514
[1] https://github.com/git-bug/git-bug
It is a single binary and I think it is also very light on resources. At least compared to gitlab.
I wonder why they don't just offer unlimited private repos for (reasonably) paid accounts. I think maybe $40 per year (or $4 monthly) is low and encouraging, and should be welcomed by many. I hope they consider it.
I imagine they would argue that private repositories do not follow this purpose, as they are neither free content nor FOSS. I believe you could argue that charging a modest fee for private repositories to finance the hosting of FOSS repositories is in line with the purpose, but you get onto thinner ice with that. It could quickly make them appear more like a company than a nonprofit.
The goal is to get at least a % available on CB, then we can think about where the community is
Forgejo does support mirrors, just not codeberg.
How has reliability been so far compared to GitHub, especially on bigger repos?
Last commit is two years ago, but still a good overview.
So I'll wait until some big projects do it. For example, the day the Rust project moves away from GitHub I'll seriously consider. (Because I use Rust)
Also radicle.xyz
> You could tell Codeberg to push new commits to GitHub, but this allows users to still file PRs and comment on issues and commits 2. Some folks have dealt with this by disabling issues on the GitHub repo, but that is a really destructive action as it will 404 all issues, and pull requests cannot be disabled. Some repos like libvirt/libvirt have written a GitHub Action that automatically closes all pull requests.
You can limit them to contributors as well.
Here you go: https://openheart.fyi
This is the only reason I haven’t migrated yet (I keep a mirror[1]).
1 - https://codeberg.org/willtobyte/carimbo
This was my biggest blocker as well, as there weren't any managed CIs that supported Codeberg until recently.
NixCI[0] recently added support for Codeberg, and I've had a great experience with it. The catch is that you have to write your CI in Nix, though with LLMs, this is actually pretty easy. Most of my CI jobs are just bash scripts with some Nix wiring on top.[1] It also means you can reproduce all your CI jobs locally without changing any code.
[0] https://nix-ci.com
[1] https://codeberg.org/mtlynch/little-moments/src/commit/d9856... - for example
I am honestly shocked that they managed to survive and thrive after getting bought by MS! So please pardon my ignorance when I say that I don’t understand why some FOSS projects want to move away.
The sheer scale of GitHub and its economic backing by giants allows it to offer a lot, for FREE.
Self-hosted projects cannot compete with this. They either get huge venture funding or remain small.
GitHub is not perfect, but nothing comes close
Source: https://bsky.app/profile/teropa.bsky.social/post/3mi2dbt27m2...
I really wish there was a way to support them with a smaller amount than €24. I don't use Codeberg myself but I really want to support them.
Wire transfer is €10
Stripe is €5
With PayPal you can send €0.01 if you want
Or Liberapay, as little as €0.01 per week
I have abandoned github and even gitlab for all intents and purposes. But there's another side to consider in this.
It's always risky for the FOSS community to depend on a service that doesn't offer interoperability and freedom of migration. Ironically, Github is such a service built on a tool (git) that's built for maximum interop and migration. But the popularity of Github among the developer community isn't an accident. They worked really hard during their early stages as a startup, to gain the trust of community. Nobody foresaw Microsoft buying them at that stage (though you should really just assume that it would happen eventually).
The reluctance of a lot of them to abandon the platform can be attributed to lack of principles - IF it was an isolated incident. But we see the same story repeating with several development platforms. NPM is an example. PyPI and crates.io are still independent, as far as I know. But they aren't free of corporate influences either. No matter how much we try to avoid them, the companies just buy their way into these platforms when they become popular enough. I'm not happy with this. But I don't know a solution either.
An overly ideological PoV can make it easy to overlook that some people are simply on Github from a practical standpoint. I myself host Forgejo and moved a lot of stuff there. I don't really find a good reason to host anything on Codeberg, yet. Github still offers me a nice set of repos to find via the people I follow there.
Don't keep all your eggs in one basket.
I want to pay for CI on my Codeberg projects, but I've been struggling to find something where I can just pay by the minute. I have projects that benefit from large CI runners but my usage is low enough that it makes no sense to host my own.
[1]: https://codefloe.com/
[2]: https://news.ycombinator.com/item?id=47487436
Yup and this is where I pass on anything other than GitHub.
I use Namespace (https://namespace.so) and I hook it up both to my personal GitHub as well as my personal Forgejo. I’m in the process of moving from the former to the latter!
Granted, self-hosting git is not feasible for everyone, but GitHub + self hosted runners seems like a very good option.
I assume this isn't optimal for a business setup, but for personal projects, I don't miss GitHub Actions at all.
It seems like, to be a serious CI platform, they really need to ship Windows and Mac binaries for runners so you can build for those platforms.
And this is more of a Forgejo issue than a Codeberg issue specifically.
But also, I’d also throw out there the idea that CI doesn’t have to be at the same website as your source control. It’s nice that GitHub actions are conveniently part of the product but it’s not even really the top CI system out there.
A contributor maintains a tested re-release of Forgejo Runner for Windows: https://github.com/Crown0815/Forgejo-runner-windows-builder
But, pull it down and build it, and it will work.
I can't imagine using GitHub without Octobox; it's just impossible to keep track of all the notifications by email.
Unfortunately, Octobox doesn't support Codeberg, so I've no idea how to follow projects, even the ones I really want to contribute to.
Can I link a codeberg repo to Railway for example?
Microsoft carefully broke classic web support over time. THX AGAIN MICROSOFT, WE LOVE YOU!
Now they are turning GitHub into a canteen for AI agents and their AI chatbots (Copilot, Tay.ai and Zoe) to feed them on your code if you don't opt out.
> The by far nastiest part is CI. GitHub has done an excellent job luring people in with free macOS runners and infinite capacity for public repos
Hosting was never free and if you do not want Codeberg to go the way of GitHub, you need to pay for it.
Otherwise expect GitHub downtime to hit every week or so.
[0] https://news.ycombinator.com/item?id=22867803
But that's the most important part. A repository without CI is basically dead.
The biggest challenge of this era is automated verification, and proper CI infrastructure is essential.
GitHub feels like what Hudson/Jenkins was some decades ago. Horrible, but the only one that did what it did.
I run probably hundreds of dollars of CI on GitHub per month. Except I don't pay a cent for it (all open source public repos). I can't just let that go, those workers do real work.
Hosted in Europe, we welcome the world.
So it's "you control it and make money" vs. "they control it and make money". What is the difference here, except some EU version of the MAGA movement?
> maga movement
Codeberg is just a hosted instance of Forgejo (GPLv3).
They even support a workflow for migrating to a different Forgejo instance [1].
[1] https://docs.codeberg.org/advanced/migrating-repos/