On Wed, 2010-11-24 at 22:49 +0000, Daniel Drake wrote:
> On 24 November 2010 22:40, Kevin Gordon <kgordon...@gmail.com> wrote:
> > Is this recommendation against yum and rpm for all software, or just the
> > olpc repo packages, the kernel and the firmware?  I'm certainly happy doing
> > just safe builds for the core.
> 
> To avoid all corner cases, the recommendation really needs to be
> "everything".
> In reality, you'll probably get away with it, especially because
> you're only really working with added packages in your deployment (not
> upgrading ones that are already installed).
> 
> Some of the resultant problems will also not affect small deployments
> like yours. For example, one side effect is that olpc-update pristine
> (efficient) updates stop working as soon as you make any filesystem
> modifications like this. Another side effect is that your
> custom-installed packages will magically disappear after an
> olpc-update upgrade (which in a real deployment would happen without
> you even knowing).
> 
> But in a small deployment like yours, touching each laptop for updates
> is probably more sensible than the knowledge and infrastructure
> investment needed for hands-off olpc-update, so you aren't affected.
> 
> > However, as part of our 'refresh' stick when we wipe and install a new
> > signed build, we generally also include the necessary rpm's for cheese and a
> > couple of other utilities that are locally installed from the USB stick
> > using a bash script; or, for the Vernier software dependencies, the
> > dependent rpm's are installed by means of a python script.  However, they
> > are rpm's, and they are downloaded onto the stick (the first time) using
> > yum and then installed from the stick using --localinstall.
> 
> You probably won't see any problem with this collection of changes.
> Nevertheless, at the SF summit I started showing Adam the "correct"
> way to do this: building a custom OS image with those customizations
> already included. We didn't have time to completely finish it, but he
> picked it up quickly and could probably finish it with a little effort
> (and perhaps a couple of mails to this list).

Having os-builder require net access in ksmain.50.repos.py is less than
ideal for remote image creation. Once the cache is downloaded, could we
not just run createrepo on the cache and point os-builder at the local
URL instead of going out to the net every time? Something like:
 
if use_cache:
    url = "file:///%s/imgcreate/%s" % (ooblib.cachedir, name)
else:
    url = "http://mock.laptop.org/repos/%s" % name

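For the local case the cache directory needs yum repodata; a rough sketch
of generating it by hand (the path is illustrative only, the real location
comes from ooblib.cachedir and the repo name from the config):

    # generate repo metadata over the already-downloaded packages so the
    # file:// baseurl above works without any network access
    createrepo -d --update "$cachedir/imgcreate/$name"

(the patch below does the same thing automatically when the repodata is
missing).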
Attached is a rough diff of what I have in mind.
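With the patch applied, switching a build over to the cached repos would
just mean setting the new option in the build config, assuming it sits in
the [global] section as read by read_config_bool('global', 'use_cache'):

    [global]
    use_cache=1

With use_cache=0 everything keeps pulling from mock.laptop.org and
xs-dev.laptop.org as before.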

On a side note, both 10.1.2 and 10.1.3 share the same olpc repo via os-builder
(http://xs-dev.laptop.org/~dsd/repos/), which makes it harder to tell the break
between the two.  Now it's impossible to re-spin os852 without hard-coding the
rpm versions elsewhere in os-builder. Just a thought.


Jerry



diff -r -u olpc-os-builder-1.2.0/modules/base/defaults.ini olpc-os-builder-au/modules/base/defaults.ini
--- olpc-os-builder-1.2.0/modules/base/defaults.ini	2010-08-31 16:22:47.000000000 -0500
+++ olpc-os-builder-au/modules/base/defaults.ini	2010-11-22 14:47:49.000000000 -0600
@@ -3,7 +3,8 @@
 olpc_version_major=0
 olpc_version_minor=0
 olpc_version_release=0
-target_platform=XO
+target_platform=XO-1.5
+use_cache=0
 
 ; please don't set official=1 unless you really are OLPC
 ; we're just trying to reduce confusion by ensuring that builds that are made
diff -r -u olpc-os-builder-1.2.0/modules/repos/ksmain.50.repos.py olpc-os-builder-au/modules/repos/ksmain.50.repos.py
--- olpc-os-builder-1.2.0/modules/repos/ksmain.50.repos.py	2010-08-31 16:22:47.000000000 -0500
+++ olpc-os-builder-au/modules/repos/ksmain.50.repos.py	2010-11-22 14:47:51.000000000 -0600
@@ -9,9 +9,15 @@
 from StringIO import StringIO
 
 def add_to_excludes(url, addexcludes):
-    url = "%s/repodata/primary.xml.gz" % url
     print >>sys.stderr, "Reading package information from", url
-    fd = urllib2.urlopen(url)
+    if url.startswith(("http", "ftp")):
+        fd = urllib2.urlopen("%s/repodata/primary.xml.gz" % url)
+    else:
+        if url.startswith("file://"):
+            url = url[7:]
+            if not os.access("%s/repodata/primary.xml.gz" % url, os.R_OK):
+                os.system('createrepo -d --update %s' % url)
+        fd = open("%s/repodata/primary.xml.gz" % url)
     data = fd.read()
     fd.close()
     fd = GzipFile(fileobj=StringIO(data))
@@ -21,6 +27,7 @@
 addexcludes = ooblib.read_config('repos', 'add_excludes_to')
 fedora = ooblib.read_config('repos', 'fedora')
 fver = ooblib.read_config('global', 'fedora_release').strip()
+use_cache = ooblib.read_config_bool('global','use_cache')
 
 # clean up addexcludes list
 if addexcludes is not None:
@@ -34,18 +41,25 @@
 
 # cycle over all 3 repos types, adding them to repos
 # add things to exclude list on-the-fly
+
 for key, value in os.environ.iteritems():
     if key.startswith("CFG_repos__olpc_frozen_"):
         for_excludes, name = value.split(',', 1)
         for_excludes = int(for_excludes)
-        url = "http://mock.laptop.org/repos/%s"; % name
+        if use_cache:   
+            url = "file:///%s/imgcreate/%s" %(ooblib.cachedir, name)
+        else:
+            url = "http://mock.laptop.org/repos/%s"; % name
         if for_excludes:
             add_to_excludes(url, excludepkgs)
         repos[name] = ("baseurl", url)
     elif key.startswith("CFG_repos__olpc_publicrpms_"):
         for_excludes, name = value.split(',', 1)
         for_excludes = int(for_excludes)
-        url = "http://xs-dev.laptop.org/~dsd/repos/%s"; % name
+        if use_cache:   
+            url = "file:///%s/imgcreate/%s" %(ooblib.cachedir, name)
+        else:
+            url = "http://xs-dev.laptop.org/~dsd/repos/%s"; % name
         if for_excludes:
             add_to_excludes(url, excludepkgs)
         repos[name] = ("baseurl", url)
@@ -55,6 +69,7 @@
         if for_excludes:
             add_to_excludes(url, excludepkgs)
         repos[name] = ("baseurl", url)
+        print >>sys.stderr, "Reading package information from", url
 
 if fedora is not None:
     for repo in fedora.split(','):
@@ -78,5 +93,3 @@
     if len(excludepkgs) > 0 and key in addexcludes:
         sys.stdout.write(" --excludepkgs=%s" % ','.join(excludepkgs))
     sys.stdout.write("\n")
-
-
diff -r -u olpc-os-builder-1.2.0/modules/sugar_activities_extra/kspost.60.nochroot.activities.sh olpc-os-builder-au/modules/sugar_activities_extra/kspost.60.nochroot.activities.sh
--- olpc-os-builder-1.2.0/modules/sugar_activities_extra/kspost.60.nochroot.activities.sh	2010-08-31 16:22:47.000000000 -0500
+++ olpc-os-builder-au/modules/sugar_activities_extra/kspost.60.nochroot.activities.sh	2010-11-22 14:47:51.000000000 -0600
@@ -3,6 +3,12 @@
 
 . $OOB__shlib
 
+use_cache=$(read_config global use_cache)
+
+if [ "$use_cache" = "1" ]; then
+    exit
+fi
+
 cache=$cachedir/activities
 
 oIFS=$IFS
diff -r -u olpc-os-builder-1.2.0/modules/sugar_activity_group/kspost.60.nochroot.activities.py olpc-os-builder-au/modules/sugar_activity_group/kspost.60.nochroot.activities.py
--- olpc-os-builder-1.2.0/modules/sugar_activity_group/kspost.60.nochroot.activities.py	2010-08-31 16:22:47.000000000 -0500
+++ olpc-os-builder-au/modules/sugar_activity_group/kspost.60.nochroot.activities.py	2010-11-22 14:47:51.000000000 -0600
@@ -167,8 +167,7 @@
 if not os.path.exists(cache):
     os.makedirs(cache)
 
-
-
+use_cache = ooblib.read_config_bool('global','use_cache')
 baseurl = ooblib.read_config('sugar_activity_group', 'url')
 install_activities = ooblib.read_config_bool('sugar_activity_group',
                                              'install_activities')
@@ -176,57 +175,67 @@
                                      'activity_group_systemwide')
 
 if install_activities:
-    vmaj = int(ooblib.read_config('global', 'olpc_version_major'))
-    vmin = int(ooblib.read_config('global', 'olpc_version_minor'))
-    vrel = int(ooblib.read_config('global', 'olpc_version_release'))
-
-    suffixes = ["%d.%d.%d" % (vmaj, vmin, vrel), "%d.%d" % (vmaj, vmin), ""]
-
-    for suffix in suffixes:
-        if len(suffix) > 0:
-            grpurl = urlparse.urljoin(baseurl + "/", urllib.quote(suffix))
-        else:
-            grpurl = baseurl
-
-        print >>sys.stderr, "Trying group URL", grpurl
-        try:
-            name, desc, results = parse_url(grpurl)
-        except urllib2.HTTPError, e:
-            if e.code == 404:
-                continue
-            raise e
-        if len(results) == 0 or (name is None and desc is None):
+    if use_cache:
+        print >>sys.stderr, "Not downloading, using cache."
+        for name in os.listdir(cache):  
+            print >>sys.stderr, "Found activity:", name
+            localpath = os.path.join(cache, name)
+            generate_install_cmd(localpath)
             continue
-        print >>sys.stderr, "Found activity group:", name
-
-        for name, info in results.items():
-            (version, url) = only_best_update(info)
-            print >>sys.stderr, "Examining", name, "v%d" % version
-            fd = urllib2.urlopen(url)
-            headers = fd.info()
-            if not 'Content-length' in headers:
-                raise Exception("No content length for %s" % url)
-            length = int(headers['Content-length'])
-            path = urlparse.urlsplit(fd.geturl())[2]
-            path = os.path.basename(path)
-
-            localpath = os.path.join(cache, path)
-            if os.path.exists(localpath):
-                localsize = os.stat(localpath).st_size
-                if localsize == length:
-                    print >>sys.stderr, "Not downloading, already in cache."
-                    generate_install_cmd(localpath)
+    else:          
+        vmaj = int(ooblib.read_config('global', 'olpc_version_major'))
+        vmin = int(ooblib.read_config('global', 'olpc_version_minor'))
+        vrel = int(ooblib.read_config('global', 'olpc_version_release'))
+
+        suffixes = ["%d.%d.%d" % (vmaj, vmin, vrel), "%d.%d" % (vmaj, vmin), ""]
+
+        for suffix in suffixes:
+            if len(suffix) > 0:
+                grpurl = urlparse.urljoin(baseurl + "/", urllib.quote(suffix))
+            else:
+                grpurl = baseurl
+
+            print >>sys.stderr, "Trying group URL", grpurl
+        
+            try:
+                name, desc, results = parse_url(grpurl)
+            except urllib2.HTTPError, e:
+                if e.code == 404:
                     continue
+                raise e
+            if len(results) == 0 or (name is None and desc is None):
+                continue
+            print >>sys.stderr, "Found activity group:", name
 
-            print >>sys.stderr, "Downloading (%dkB)..." % (length/1024)
-            localfd = open(localpath, 'w')
-            localfd.write(fd.read())
-            fd.close()
-            localfd.close()
-            generate_install_cmd(localpath)
+            for name, info in results.items():
+                (version, url) = only_best_update(info)
+                print >>sys.stderr, "Examining", name, "v%d" % version
+                fd = urllib2.urlopen(url)
+                headers = fd.info()
+                if not 'Content-length' in headers:
+                    raise Exception("No content length for %s" % url)
+                length = int(headers['Content-length'])
+                path = urlparse.urlsplit(fd.geturl())[2]
+                path = os.path.basename(path)
+
+                localpath = os.path.join(cache, path)
+                if os.path.exists(localpath):
+                    localsize = os.stat(localpath).st_size
+                    if localsize == length:
+                        print >>sys.stderr, "Not downloading, already in cache."
+                        generate_install_cmd(localpath)
+                        continue
+
+                print >>sys.stderr, "Downloading (%dkB)..." % (length/1024)
+                localfd = open(localpath, 'w')
+                localfd.write(fd.read())
+                fd.close()
+                localfd.close()
+                generate_install_cmd(localpath)
 
         # only process the first working URL
-        break
+            break
+
 
 if systemwide:
 	print "mkdir -p $INSTALL_ROOT/etc/olpc-update"
