The new hardware has eth0 (leftmost when looking from the back) with an orange loopback cable to "old" savannah, and eth1 (rightmost when looking from the back) with a black cable to port 22 of ge-sw1.qcy. The !MACs end with :2c and :2d, but there seems to be a confusion/swap at boot time (e.g. pxeboot), so don't rely on them. PXE boot needs to be enabled by an FSF sysadmin (by switching !VLANs). Danny thinks CD-ROM boot won't work with this PXE boot version. Type F12 and select ``5. gPXE (PCI 00:09.0)`` to PXE-boot.
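Because the MACs can swap between eth0/eth1 at boot, a script that needs "the interface cabled to ge-sw1.qcy" is safer keying on the MAC suffix than on the interface name. A minimal sketch (the helper name and the optional test-tree argument are my own, not from the page; the ``:2c``/``:2d`` suffixes come from the text above):

```shell
#!/bin/sh
# Return the name of the network interface whose MAC address ends with
# the given suffix, instead of trusting the eth0/eth1 enumeration order.
# An optional second argument points at an alternate sysfs-like tree,
# which is only there to make the helper testable off the real machine.
find_iface_by_mac_suffix() {
    suffix=$1
    base=${2:-/sys/class/net}
    for dev in "$base"/*; do
        [ -r "$dev/address" ] || continue
        mac=$(cat "$dev/address")
        case $mac in
            *"$suffix") basename "$dev"; return 0 ;;
        esac
    done
    return 1
}

# Example: interface wired to port 22 of ge-sw1.qcy (per the cabling notes)
# find_iface_by_mac_suffix :2d
```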
Here are the subsystems/guests/!DomUs we currently use::

* first, the principal host is colonialone.fsf.org (not a virtual server) where admins can log in. There's a disabled Apache with a "Savannah downtime" frontpage.
* builder: used to rebuild software, Debian packages or kernels. It's separate so there's no problem with installing dependencies, testing them, etc. There's a pbuilder (cowbuilder) ready for Etch+backports rebuilds.
* internal: runs internal services, like the !MySQL database, the Exim MTA (with From: and sender rewrite), and a Debian mirror. There's also a public Cacti instance there.
* sftp (aka dl.sv.gnu.org): public services that are accessed through a restricted shell and allow users to write files anywhere they want (using SFTP/SCP/rsync): download, bzr, GNU Arch, and an Apache instance to access them.
* vcs-noshell (aka vcs in ssh): public services that are accessed through a restricted shell: CVS, Git, SVN, Hg, with their respective web browsing apps.

Access these Xen !DomUs using 'ssh [email protected]'. You can also use just the short names, courtesy of ~/.ssh/config on colonialone. You need to run this from the Dom0, because some of the guests are only accessible through internal !IPs.

A: No (unfortunately). !VServers could share a single filesystem, which was convenient, but Xen has all virtual machines on separate filesystems (as it

You can log in from colonialone to any virtual machine, even internal

-- forwarded from http://savannah.gnu.org/maintenance/SavannahArchitecture#[email protected]/maintenance

_______________________________________________
Savannah-cvs mailing list
[email protected]
http://lists.gnu.org/mailman/listinfo/savannah-cvs
