On Wednesday, 22 August 2018 at 15:17:36 UTC, Jesse Phillips wrote:
> It is weird that you consider losing current and historical pull requests a minor issue.

It would be disruptive. However, work could resume rather quickly.

The disruption would be reduced if we had a periodic job set up to mirror GitHub pull requests. There is at least one export tool, but it seems to capture only the titles, not the comments or patches.
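A minimal sketch of such a periodic job. GitHub does publish each pull request's head commit as a read-only ref, `refs/pull/<N>/head`, so the patches (though not the comments) can be archived with a single fetch refspec. The repository names here are hypothetical, and a local bare repo stands in for github.com so the sketch is self-contained:

```shell
#!/bin/sh
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Stand-in for the upstream GitHub repository.
git init -q --bare upstream.git
git clone -q upstream.git work
cd work
git -c user.email=a@b -c user.name=ci commit -q --allow-empty -m "initial"
git push -q origin HEAD:refs/heads/master
# Simulate a pull request: GitHub stores its head under refs/pull/<N>/head.
git -c user.email=a@b -c user.name=ci commit -q --allow-empty -m "PR #1 patch"
git push -q origin HEAD:refs/pull/1/head
cd ..

# The periodic mirror job: clone once, then refresh every PR head on a timer.
git clone -q --mirror upstream.git archive.git
git -C archive.git fetch -q origin '+refs/pull/*/head:refs/pull/*/head'
git -C archive.git show -s --format=%s refs/pull/1/head   # prints "PR #1 patch"
```

The comments would still have to come from the REST API (e.g. `GET /repos/{owner}/{repo}/pulls/{n}/comments`), since they are not stored in git at all; that is presumably why the export tool mentioned above misses them.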

Compared to:

* Having all the data readily available for search engines to have archived (today, not tomorrow).
* Having an established forum/newsgroup readily available to handle the load of new questions.

I just don't see data retention and recovery as a concern in choosing StackOverflow. Even if it did take weeks or months to host the historical data, that risk should be weighed against the possible benefit of visibility and growth from heavily using StackOverflow.

And similarly, the choice of GitHub over a self-hosted system is weighed against requiring people to sign up with a private GitLab instance. Also similarly, the disruption would be reduced if we set up a periodic job in advance to handle long-term StackOverflow unavailability.

I'm a little paranoid about centralized services like GitHub. I'd prefer a federated service for source control and project management, where you could easily fork projects from my server to yours and send back pull requests. Then there would be no extra cost for hosting your own instance versus using an existing one.

I've been low-key thinking about making a federated GitHub, one where exporting your data is as simple as `git clone; git submodule update --init`. Probably nothing will come of it, though.
