Lee,

I tried the same thing: integrating a full workflow in Python for r.sun (see attached code).  I am using a 2x2 meter lidar elevation map with about 120 million cells, so automation is pretty much necessary.  The whole thing takes 3 weeks to run using 8 CPU cores and 12 GB of RAM.

However, r.horizon does not work properly on GRASS 6.4.2.  I can only get it to run on one core, in sequence.  I cannot get it to accept a single angle and run properly: the output is always for angle 0, both in the map name and in the data.  Bug?  I was planning to post this to the dev list if no one responds here.

It would be worthwhile to find out whether your version is simply overwriting the previous, similarly named map.  If it is, and the maps are actually different (check with r.info), then you can change your code to run r.horizon for a single angle and then rename the map.  Something like this:

       for azimuth in range(0, 360, step_size):  # 360 would just repeat 0
               grass.run_command("r.horizon", elevin=dem, \
                                 direction=azimuth, maxdistance=maxdistance, \
                                 horizon=prefhor, dist=dist, overwrite=1)
               az = str(azimuth).zfill(3)
               tempmap = prefhor + '_0'
               horizonmap = prefhor + '_' + az
               cmd = tempmap + "," + horizonmap
               grass.run_command("g.rename", rast=cmd)
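The naming/renaming bookkeeping from that loop, pulled out as plain Python so you can sanity-check it without GRASS (the 'demhor' prefix and the 90-degree step below are only for illustration):

```python
def horizon_rename_pairs(prefix, step_size):
    """For each azimuth, a single-direction r.horizon run writes <prefix>_0;
    we then rename that map to <prefix>_<azimuth zero-padded to 3 digits>."""
    pairs = []
    for azimuth in range(0, 360, step_size):  # 360 would just repeat 0
        tempmap = prefix + '_0'
        horizonmap = prefix + '_' + str(azimuth).zfill(3)
        pairs.append((tempmap, horizonmap))
    return pairs

print(horizon_rename_pairs('demhor', 90))
# [('demhor_0', 'demhor_000'), ('demhor_0', 'demhor_090'),
#  ('demhor_0', 'demhor_180'), ('demhor_0', 'demhor_270')]
```

The zero-padding keeps the 360 map names sorting correctly in g.list.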

r.sun does work on multiple cores.
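For what it's worth, the way the attached script spreads r.sun over cores is simple day-striding: worker n starts at start_day + n*week_step and then steps by cores*week_step, so together the workers cover every week of the year exactly once, with no overlap. The scheme in plain Python (numbers match the script's defaults):

```python
def days_for_worker(worker, cores, start_day=5, week_step=7):
    """Julian days handled by one worker under the striding scheme."""
    seed = start_day + worker * week_step
    return list(range(seed, 366, cores * week_step))

# With 2 cores, the two workers' day lists interleave to cover
# every 7th day from day 5, exactly once.
all_days = sorted(days_for_worker(0, 2) + days_for_worker(1, 2))
print(all_days == list(range(5, 366, 7)))  # True
```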

Collin Bode
Project Manager, Desktop Watershed Integrated Program
National Center for Earth-surface Dynamics
University of California, Berkeley
http://www.nced.umn.edu
#!/usr/bin/env python
############################################################################
#
# MODULE:       2_rsun.py
# AUTHOR:       Collin Bode, UC Berkeley                March, 2011
#               Converted from BASH script to Python    March, 2012
#
# PURPOSE:      Run the entire sequence of processing for the r.sun model.
#               Threads a process per core for multicore processors (poor
#               man's multithreading).
#
# SOURCE MAPS:  bare-earth DEM and canopy DEM; all else is calculated from them.
#
# COPYRIGHT:    (c) 2012 Collin Bode
#               This program is free software under the GNU General Public
#               License (>=v2). Read the file COPYING that comes with GRASS
#               for details.
#
#############################################################################

#%Module
#% description: Runs the entire r.sun processing sequence (preprocessing, r.horizon, r.sun)
#% keywords: raster
#% keywords: solar
#%End

import sys
import os
import shutil
import re
import datetime as dt
import multiprocessing as mp
gisbase = os.environ['GISBASE'] = "/usr/lib64/grass-6.4.2"
gisdbase = os.path.abspath("/data/rsun/")
sys.path.append(os.path.join(os.environ['GISBASE'], "etc", "python"))
import grass.script as grass
import grass.script.setup as gsetup
import numpy
from scipy.interpolate import interpolate

# SECTIONS TO RUN
# R1-3 allow you to turn off sections (1=yes, 0=no)
R1 = 0  # Preprocessing
R2 = 0  # Horizon calculation <-- AS FAR AS I CAN TELL THIS IS USELESS. CAN'T MULTIPROC R.HORIZON
R3 = 1  # Solar model run (r.sun)

# GENERAL PARAMETERS
P = "sfk"                       # location prefix
C = "2"                        # cell size. "2 or 10"
location = P+C+'m'              # GRASS map location
default_mapset = 'PERMANENT'    # GRASS default mapset
JulianDays = range(5,366,7)

# R2 Horizon Parameters
mhorizon = 'horizon_b10k'       # horizon mapset
maxdistance = '10000'           # maxdistance = 10000 meters (longer than the diagonal of the map)
hstep = '1'                     # horizonstep = 1 degree (creates 360 maps)
dist = '0.5'                    # normal range 0.5 - 1.5; previous runs used 0.3 (artifacting?)
                                # dist=1.0 or greater uses a simplified calculation that causes artifacts

# R3 Solar Model Parameters
linke_array = "ltm1"          # various options of turbidity values
msun = 'sunb8k'+linke_array       # r.sun output mapset
start_day = 5                   # First Julian Day calculated
week_step = 7                   # run r.sun once every week
timestep = '0.1'                # 0.1 = 6 minute timestep, default 0.5(30min), last run 0.5
numpartitions = 2

# MAP NAMES
loc = location
pref = 'b8k'+C+'m'             # map prefix - location & cell size
dem = loc + 'dem'              # source map: bare-earth dem
can = loc + 'can'              # source map: canopy dem
sloped = dem + 'slope'          # slope, bare-earth
slopec = can + 'slope'          # slope, canopy
aspectd = dem + 'aspect'        # aspect, bare-earth
aspectc = can + 'aspect'        # aspect, canopy
vegheight = loc + 'vegheight'  # vegetation height
albedo = loc + 'albedo'        # albedo by vegtype
demhor = dem + 'hor'            # horizon, bare-earth
canhor = can + 'hor'            # horizon, canopy


def preprocessing(mapset):
        gsetup.init(gisbase, gisdbase, location, mapset)

        # Slope and Aspect
        grass.run_command("r.slope.aspect", elevation=dem, slope=sloped, \
                          aspect=aspectd, prec="float", zfactor=1, overwrite=1)
        grass.run_command("r.slope.aspect", elevation=can, slope=slopec, \
                          aspect=aspectc, prec="float", zfactor=1, overwrite=1)

        # Vegetation height
        grass.mapcalc("$vegheight = $can - $dem", overwrite = 1, \
                      vegheight = vegheight, can = can, dem = dem)

        # Albedo
        grass.run_command("r.recode",flags="a",input=vegheight,output=albedo,\
                           rules=gisdbase+'/scripts/albedo_recode.txt', overwrite=1)
"""
def worker_horizon(cpu,dist,maxdistance,demh):  # doesn't work properly
        mtemp = 'temp'+str(cpu).zfill(2)
        gsetup.init(gisbase, gisdbase, location, mtemp)
        # step_size should = # cpu cores
        step_size = cores
        angle = 340+cpu
        prefhor = demh+'hor'
        demh = demh+'@PERMANENT'
        for azimuth in range(angle,361,step_size):
                # BUG in original: direction=angle never changed inside this
                # loop; it should be direction=azimuth.
                grass.run_command("r.horizon", elevin=demh, \
                                  direction=azimuth, maxdistance=maxdistance, \
                                  horizon=prefhor, dist=dist, overwrite=1)
                az = str(azimuth).zfill(3)
                tempmap = prefhor+'_0'
                horizonmap = prefhor+'_'+az
                cmd = tempmap+","+horizonmap
                grass.run_command("g.rename",rast=cmd)        
"""

def worker_horizon(ntemp, demh):
        mtemp = 'temp'+str(ntemp).zfill(2)
        gsetup.init(gisbase, gisdbase, location, mtemp)
        prefhor = demh+'hor'
        demh = demh+'@PERMANENT'
        grass.run_command("r.horizon", elevin=demh, horizonstep=hstep, dist=dist, \
                maxdistance=maxdistance, horizon=prefhor, overwrite = 1)

def worker_sun(cpu,julian_seed,step,demr):
        mtemp = 'temp'+str(cpu).zfill(2)
        gsetup.init(gisbase, gisdbase, location, mtemp)
        # remove g.region for normal runs
        #grass.run_command("g.region", n=4400220.4, s=4395220.4, e=447761.8, w=442761.8) # b5k
        grass.run_command("g.region", n=4401000.00, s=4393000.00, e=450000.00, w=442000.00) #b8k
        #grass.run_command("g.region", n=4401000.00, s=4391000.00, e=450000.00, w=440000.00) # b10k

        # Input Maps
        elevdem = loc+demr+'@PERMANENT'
        horr = loc+demr+'hor'
        
        # Run r.sun for each week of the year 
        for doy in range(julian_seed,366,step):
                day = str(doy).zfill(3)
                linke = linke_interp(doy,linke_array)
                # Output maps
                beam = pref+demr+day+'beam'
                dur = pref+demr+day+'dur'
                diff = pref+demr+day+'diff'
                refl = pref+demr+day+'refl'
                glob = pref+demr+day+'glob'
                grass.run_command("r.sun", flags="s", elevin=elevdem, albedo=albedo, \
                                  horizon=horr, horizonstep=hstep, \
                                  beam_rad=beam, insol_time=dur, \
                                  diff_rad=diff, refl_rad=refl, \
                                  glob_rad=glob, day=doy, step=timestep, \
                                  lin=linke, overwrite=1)	#  ,numpartitions=numpartitions

def create_temp(cores):
        gsetup.init(gisbase, gisdbase, location, 'PERMANENT')
        for count in range(0, cores):
                temp = 'temp'+str(count).zfill(2)
                grass.run_command("g.mapset", flags="c", mapset=temp, quiet=1)

def remove_temp(cores):
        # run from the default mapset, not from one of the temp mapsets
        gsetup.init(gisbase, gisdbase, location, default_mapset)
        # Delete the temp mapsets
        for count in range(0, cores):
                mapset_temp = 'temp'+str(count).zfill(2)
                grass.run_command("g.mapsets", removemapset=mapset_temp)
                temp_path = gisdbase+'/'+location+'/'+mapset_temp
                shutil.rmtree(temp_path)

def copy_mapset(mapset_from,mapset_to,regfilter,overwrite):
        # list contents of temp mapset
        grass.run_command("g.mapset", mapset=mapset_from, quiet=1)
        raster_list = grass.list_pairs(type = 'rast')
        # Switch to target mapset and copy rasters over
        grass.run_command("g.mapset", mapset=mapset_to,quiet=1)
        for rast in raster_list:
                if(rast[1] != 'PERMANENT' and re.search(regfilter,rast[0])):
                        old = rast[0]+ '@' + rast[1]
                        new = rast[0] 
                        cmd = old+","+new
                        grass.run_command("g.copy", rast=cmd, overwrite=overwrite)
        
def linke_interp(day,turb_array):
        # put monthly data here
        # Angelo area LT from helios - cab
        # ltm1 and angelo-1 are identical, kept for backwards compatibility.  They are helios - 1
        if turb_array == 'helios':
                linke_data = numpy.array ([3.2,3.2,3.2,3.4,3.7,3.8,3.7,3.8,3.5,3.4,3.1,2.9])
        elif turb_array == 'angelo80':
                linke_data = numpy.array ([2.56,2.56,2.56,2.72,2.96,3.04,2.96,3.04,2.80,2.72,2.48,2.32])
        elif turb_array == 'angelo70':
                linke_data = numpy.array ([2.3,2.3,2.3,2.5,2.7,2.8,2.7,2.8,2.6,2.5,2.3,2.1])
        elif turb_array == 'angelo-1':
                linke_data = numpy.array ([2.2,2.2,2.2,2.4,2.7,2.8,2.7,2.8,2.5,2.4,2.1,1.9])
        elif turb_array == 'ltm1':
                linke_data = numpy.array ([2.2,2.2,2.2,2.4,2.7,2.8,2.7,2.8,2.5,2.4,2.1,1.9])
        else:
                linke_data = numpy.array ([1.5,1.6,1.8,1.9,2.0,2.3,2.3,2.3,2.1,1.8,1.6,1.5])

        linke_data_wrap = numpy.concatenate((linke_data[9:12],linke_data, linke_data[0:3]))
        monthDays = numpy.array ([0,31,28,31,30,31,30,31,31,30,31,30,31])
        midmonth_day = numpy.array ([0,0,0,0,0,0,0,0,0,0,0,0]) # create empty array to fill
        for i in range(1, 12+1):
                midmonth_day[i-1] = 15 + sum(monthDays[0:i])
        midmonth_day_wrap = numpy.concatenate((midmonth_day[9:12]-365, midmonth_day,midmonth_day[0:3]+365))
        linke = interpolate.interp1d(midmonth_day_wrap, linke_data_wrap, kind='cubic')
        lt = linke(day)
        return lt

def printout(str_text):
        timestamp = dt.datetime.strftime(dt.datetime.now(), "%H:%M:%S")
        lf.write(timestamp+": "+str_text+'\n')
        print timestamp+": "+str_text

def main():
        ##################################
        # INTRO
        # Multiprocessing
        global cores
        global lf
        cores = mp.cpu_count()
        cores = 2  # hard-coded override while r.horizon multiprocessing is broken (4/7/2012)

        # Open log file
        tlog = dt.datetime.strftime(dt.datetime.now(), "%Y-%m-%d_h%Hm%M")
        lf = open('rsun_run_'+tlog+'.log', 'w')
        # say hello
        printout("STARTING R.SUN MODELING RUN")
        printout("LOCATION: "+location)
        printout("This computer has "+str(cores)+" CPU cores.")
        # write params to log file
        lf.write('R0 pref: '+pref+'\n')
        lf.write('R0 dem: '+dem+'\n')
        lf.write('R0 can: '+can+'\n')
        lf.write('R2 horizon mapset: '+mhorizon+'\n')
        lf.write('R2 horizonstep: '+hstep+'\n')
        lf.write('R2 dist: '+dist+'\n')
        lf.write('R2 maxdistance: '+maxdistance+'\n')
        lf.write('R3 r.sun mapset: '+msun+'\n')
        lf.write('R3 linke_array: '+linke_array+'\n')
        lf.write('R3 timestep: '+timestep+'\n')
        lf.write('_________________________________\n')

        ##################################
        # R1. PREPROCESSING
        if(R1 == 1):
                R1start = dt.datetime.now()
                R1starttime = dt.datetime.strftime(R1start,"%m-%d %H:%M:%S")
                printout('R1. START preprocessing at '+ R1starttime)

                preprocessing(default_mapset)

                R1end = dt.datetime.now()
                R1endtime = dt.datetime.strftime(R1end,"%m-%d %H:%M:%S")
                R1processingtime = R1end - R1start
                printout('R1. END preprocessing at '+ R1endtime+', processing time: '+str(R1processingtime))
                
        ##################################
        # R2 HORIZON CALCULATIONS
        # 10m at 2012-03-26 16:55:47, processing time: 1:10:54
        # Had to rewrite this because r.horizon behaves erratically using multiprocessing.
        # now just runs in 2 processes, one for each dem.
        #
        if(R2 == 1):
                R2start = dt.datetime.now()
                R2starttime = dt.datetime.strftime(R2start,"%m-%d %H:%M:%S")
                printout('R2 START  r.horizon '+ R2starttime)

                # Create one temp directory for each CPU core
                printout("Creating Temporary directories, one per cpu core.")
                create_temp(cores)

                # Spawn process for r.horizon DEM
                pd = mp.Process(target=worker_horizon, args=(0,dem))
                pd.start()
                dpid = str(pd.pid)
                printout("R.Horizon: dem = "+dem+" process id = "+dpid)

                # Spawn process for r.horizon CAN
                pc = mp.Process(target=worker_horizon, args=(1,can))
                pc.start()
                cpid = str(pc.pid)
                printout("R.Horizon: dem = "+can+" process id = "+cpid)

                # Wait for all the Processes to finish
                pd.join()
                printout(dpid+" joined.")
                pc.join()
                printout(cpid+" joined.")

                # Copy horizon maps from temp mapsets to horizon mapset
                copy_mapset('temp00',mhorizon,'hor',1)
                copy_mapset('temp01',mhorizon,'hor',1)
                                
                # Delete temp directories
                #remove_temp(cores)

                R2end = dt.datetime.now()
                R2endtime = dt.datetime.strftime(R2end,"%m-%d %H:%M:%S")
                R2processingtime = R2end - R2start
                printout('R2 END r.horizon, '+C+' at '+ R2endtime + ', processing time: '+str(R2processingtime))
        
        ##################################
        # R3. R.SUN SOLAR MODEL
        if(R3 == 1):
                R3start = dt.datetime.now()
                R3starttime = dt.datetime.strftime(R3start,"%m-%d %H:%M:%S")
                printout('R3. START  '+ R3starttime)

                # Create one temp directory for each CPU core
                printout("Creating Temporary directories, one per cpu core.")
                create_temp(cores)
                """
                # Copy horizon maps into temp mapsets
                for cpu in range(0,cores):
                        mtemp = 'temp'+str(cpu).zfill(2)
                        copy_mapset(mhorizon, mtemp,'hor',0)
                """
                # Spawn process for r.horizon
                step = cores * week_step
                for demr in ['dem','can']:  # list, not set: deterministic order
                        julian_seed = start_day
                        jobs = []
                        for cpu in range(0,cores):
                                p = mp.Process(target=worker_sun, args=(cpu,julian_seed,step,demr))
                                p.start()
                                jobs.append(p)
                                pid = str(p.pid)
                                printout("R3. r.sun: dem = "+demr+" cpu = "+str(cpu)+" julian_seed = "+str(julian_seed)+" pid = "+pid)
                                julian_seed += week_step

                        # Wait for all the Processes to finish		
                        for p in jobs:
                                pid = str(p.pid)
                                palive = str(p.is_alive())
                                printout(demr+" on "+pid+" is alive = "+palive)
                                p.join()
                                printout(pid+" joined.")
                        printout("R3. r.sun finished for "+demr)

                # Copy all the files back over to sun mapset 
                for cpu in range(0,cores):
                        mtemp = 'temp'+str(cpu).zfill(2)
                        suffixes = ['glob','beam','diff','refl','dur']
                        for suf in suffixes:
                                copy_mapset(mtemp, msun,suf,1)

                # Delete the temp mapsets
                #remove_temp(cores)

                # Finish
                R3end = dt.datetime.now()
                R3endtime = dt.datetime.strftime(R3end,"%m-%d %H:%M:%S")
                R3processingtime = R3end - R3start 
                printout('R3. END  at '+ R3endtime+ ', processing time: '+str(R3processingtime))

        lf.close()
        sys.exit("FINISHED.")

if __name__ == "__main__":
        #options, flags = grass.parser()
        main()

On Apr 1, 2012, at 5:10 AM, Daniel Lee wrote:

Hi list,

I've got a large area I'm running r.horizon on. The process takes quite a while but only runs on one processor core, and I've got plenty of RAM, so I've decided to script it in Python, using run_command and start_command to create multiple horizon maps in parallel. Another reason for doing it this way is that I only need certain horizons, so I don't want GRASS to compute all 360°.
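Roughly, each process gets a prefix of 'horizon' plus the angle, and r.horizon appends '_0' to it for a single-direction run, which is why every map name in the log below ends in '_0'. The naming part, stripped of the GRASS call itself (angles here are just examples):

```python
def output_name(angle):
    """Map name a single-direction r.horizon run produces when given
    a prefix of 'horizon' + the angle, as this script passes it."""
    prefix = 'horizon' + str(angle)
    return prefix + '_0'  # single-direction runs always get suffix _0

for angle in (0, 12.5, 150.0):
    print(output_name(angle))
# horizon0_0
# horizon12.5_0
# horizon150.0_0
```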

The problem is that when I run the script, all of the maps I want seem to be made, and I can see in top that there are several instances of r.horizon running. But only the maps from the instances that finish last are actually saved. Here's the output:

Calculating map 1 of 1 (angle 0.000000, raster map <horizon0_0>)
 100%
Calculating map 1 of 1 (angle 12.500000, raster map <horizon12.5_0>)
Calculating map 1 of 1 (angle 7.500000, raster map <horizon7.5_0>)
Calculating map 1 of 1 (angle 42.500000, raster map <horizon42.5_0>)
Calculating map 1 of 1 (angle 150.000000, raster map <horizon150.0_0>)
 100%
Calculating map 1 of 1 (angle 162.500000, raster map <horizon162.5_0>)
 100%
Calculating map 1 of 1 (angle 175.000000, raster map <horizon175.0_0>)
Calculating map 1 of 1 (angle 167.500000, raster map <horizon167.5_0>)
Calculating map 1 of 1 (angle 197.500000, raster map <horizon197.5_0>)
Calculating map 1 of 1 (angle 205.000000, raster map <horizon205.0_0>)
Calculating map 1 of 1 (angle 225.000000, raster map <horizon225.0_0>)
 100%
Calculating map 1 of 1 (angle 257.500000, raster map <horizon257.5_0>)
Calculating map 1 of 1 (angle 262.500000, raster map <horizon262.5_0>)
 100%
Calculating map 1 of 1 (angle 275.000000, raster map <horizon275.0_0>)
 100%
Calculating map 1 of 1 (angle 295.000000, raster map <horizon295.0_0>)
Calculating map 1 of 1 (angle 300.000000, raster map <horizon300.0_0>)
 100%
Calculating map 1 of 1 (angle 322.500000, raster map <horizon322.5_0>)
Calculating map 1 of 1 (angle 325.000000, raster map <horizon325.0_0>)
 100%
Calculating map 1 of 1 (angle 337.500000, raster map <horizon337.5_0>)
Calculating map 1 of 1 (angle 335.000000, raster map <horizon335.0_0>)
Calculating map 1 of 1 (angle 345.000000, raster map <horizon345.0_0>)
Calculating map 1 of 1 (angle 350.000000, raster map <horizon350.0_0>)
 100%
Calculating map 1 of 1 (angle 352.500000, raster map <horizon352.5_0>)
 100%
Calculating map 1 of 1 (angle 355.000000, raster map <horizon355.0_0>)
 100%
Calculating map 1 of 1 (angle 357.500000, raster map <horizon357.5_0>)
 100%

But when I do g.list rast at the end of it all I only have these maps:
raster files available in mapset <Cloppenburg>:
dom_aspect      horizon162.5_0  horizon325.0_0
dom_complete    horizon225.0_0  horizon350.0_0
dom_slope       horizon262.5_0  horizon352.5_0
horizon0_0      horizon275.0_0  horizon355.0_0
horizon150.0_0  horizon300.0_0  horizon357.5_0

As you can see, the only horizon maps saved are the ones finished last. Does anybody know why this happens and, if so, what a good workaround would be? Computing the horizons serially on only one processor means that my computer does a lot of sitting around twiddling its thumbs. Thanks a bunch!

Daniel

--

B.Sc. Daniel Lee
Head of Research and Development
ISIS - International Solar Information Solutions GbR
Represented by: Daniel Lee, Nepomuk Reinhard and Nils Räder

Deutschhausstr. 10
35037 Marburg
Landline: +49 6421 379 6256
Mobile: +49 176 6127 7269
E-Mail: l...@isi-solutions.org
Web: http://www.isi-solutions.org

ISIS is funded by the Federal Republic of Germany (grant provider: Federal Ministry of Economics and Technology, following a resolution of the German Bundestag) and by the European Union (grant provider: European Social Fund).
ISIS receives additional support from the Entrepreneurship Cluster Mittelhessen, the University of Marburg, the Laboratory for Climatology and Remote Sensing, and the GIS-Lab Marburg.


_______________________________________________
grass-user mailing list
grass-user@lists.osgeo.org
http://lists.osgeo.org/mailman/listinfo/grass-user
